We estimate a model with latent factors that summarize the yield curve (namely, level, slope, and curvature) as well as observable macroeconomic variables (real activity, inflation, and the stance of monetary policy). Our goal is to provide a characterization of the dynamic interactions between the macroeconomy and the yield curve. We find strong evidence of the effects of macro variables on future movements in the yield curve and much weaker evidence for a reverse influence. We also relate our results to a traditional macroeconomic approach based on the expectations hypothesis.
We extend range-based volatility estimation to the multivariate case. In particular, we propose a range-based covariance estimator motivated by a key financial economic consideration, the absence of arbitrage, in addition to statistical considerations. We show that this estimator is highly efficient yet robust to market microstructure noise arising from bid-ask bounce and asynchronous trading.
We take a nonstructural time-series approach to modeling and forecasting daily average temperature in ten U.S. cities, and we inquire systematically as to whether it may prove useful from the vantage point of participants in the weather derivatives market. The answer is, perhaps surprisingly, yes. Time series modeling reveals both strong conditional mean dynamics and conditional variance dynamics in daily average temperature, and it reveals sharp differences between the distribution of temperature and the distribution of temperature surprises. Most importantly, it adapts readily to produce the long-horizon forecasts of relevance in weather derivatives contexts. We produce and evaluate both point and distributional forecasts of average temperature, with some success. We conclude that additional inquiry into nonstructural weather forecasting methods, as relevant for weather derivatives, will likely prove useful.
Despite powerful advances in yield curve modeling in the last twenty years, comparatively little attention has been paid to the key practical problem of forecasting the yield curve. In this paper we do so. We use neither the no-arbitrage approach, which focuses on accurately fitting the cross section of interest rates at any given time but neglects time-series dynamics, nor the equilibrium approach, which focuses on time-series dynamics (primarily those of the instantaneous rate) but pays comparatively little attention to fitting the entire cross section at any given time and has been shown to forecast poorly. Instead, we use variations on the Nelson-Siegel exponential components framework to model the entire yield curve, period-by-period, as a three-dimensional parameter evolving dynamically. We show that the three time-varying parameters may be interpreted as factors corresponding to level, slope and curvature, and that they may be estimated with high efficiency. We propose and estimate autoregressive models for the factors, and we show that our models are consistent with a variety of stylized facts regarding the yield curve. We use our models to produce term-structure forecasts at both short and long horizons, with encouraging results. In particular, our forecasts appear much more accurate at long horizons than various standard benchmark forecasts. Finally, we discuss a number of extensions, including generalized duration measures, applications to active bond portfolio management, and arbitrage-free specifications.
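The Nelson-Siegel framework described above reduces each cross section of yields to three factors via fixed exponential loadings. A minimal sketch of that step, assuming squared-error fitting by OLS and an illustrative decay parameter (the value 0.0609 for maturities in months is a common choice, not prescribed by this abstract):

```python
import numpy as np

def nelson_siegel_loadings(maturities, lam=0.0609):
    """Nelson-Siegel loadings at each maturity (in months): a constant
    (level), a decaying term (slope), and a humped term (curvature)."""
    m = np.asarray(maturities, dtype=float)
    slope = (1.0 - np.exp(-lam * m)) / (lam * m)
    curvature = slope - np.exp(-lam * m)
    return np.column_stack([np.ones_like(m), slope, curvature])

def fit_factors(yields, maturities, lam=0.0609):
    """Estimate (level, slope, curvature) for one cross section by OLS."""
    X = nelson_siegel_loadings(maturities, lam)
    beta, *_ = np.linalg.lstsq(X, np.asarray(yields, dtype=float), rcond=None)
    return beta
```

Repeating `fit_factors` period-by-period yields the three factor time series to which autoregressive models can then be fit.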
Volatility has been one of the most active areas of research in empirical finance and time series econometrics during the past decade. This chapter provides a unified continuous-time, frictionless, no-arbitrage framework for systematically categorizing the various volatility concepts, measurement procedures, and modeling procedures. We define three different volatility concepts: (i) the notional volatility corresponding to the ex-post sample-path return variability over a fixed time interval, (ii) the ex-ante expected volatility over a fixed time interval, and (iii) the instantaneous volatility corresponding to the strength of the volatility process at a point in time. The parametric procedures rely on explicit functional form assumptions regarding the expected and/or instantaneous volatility. In the discrete-time ARCH class of models, the expectations are formulated in terms of directly observable variables, while the discrete- and continuous-time stochastic volatility models involve latent state variable(s). The nonparametric procedures are generally free from such functional form assumptions and hence afford estimates of notional volatility that are flexible yet consistent (as the sampling frequency of the underlying returns increases). The nonparametric procedures include ARCH filters and smoothers designed to measure the volatility over infinitesimally short horizons, as well as the recently-popularized realized volatility measures for (non-trivial) fixed-length time intervals.
Using a new dataset consisting of six years of real-time exchange rate quotations, macroeconomic expectations, and macroeconomic realizations (announcements), we characterize the conditional means of U.S. dollar spot exchange rates versus German Mark, British Pound, Japanese Yen, Swiss Franc, and the Euro. In particular, we find that announcement surprises (that is, divergences between expectations and realizations, or "news") produce conditional mean jumps; hence high-frequency exchange rate dynamics are linked to fundamentals. The details of the linkage are intriguing and include announcement timing and sign effects. The sign effect refers to the fact that the market reacts to news in an asymmetric fashion: bad news has greater impact than good news, which we relate to recent theoretical work on information processing and price discovery.
Weather derivatives are a fascinating new type of Arrow-Debreu security, making pre-specified payouts if pre-specified weather events occur, and the market for such derivatives has grown rapidly. Weather modeling and forecasting are crucial to both the demand and supply sides of the weather derivatives market. On the demand side, to assess the potential for hedging against weather surprises and to formulate the appropriate hedging strategies, one needs to determine how much "weather noise" exists for weather derivatives to eliminate, and that requires weather modeling and forecasting. On the supply side, standard approaches to arbitrage-free pricing are irrelevant in weather derivative contexts, and so the only way to price options reliably is again by modeling and forecasting the underlying weather variable. Curiously, however, little thought has been given to the crucial question of how best to approach weather modeling and forecasting in the context of weather derivative demand and supply. The vast majority of extant weather forecasting literature has a structural "atmospheric science" feel, and although such an approach may be best for forecasting six hours ahead, it is not obvious that it is best for the longer horizons relevant for weather derivatives, such as six days, six weeks, or six months. In particular, good forecasting does not necessarily require a structural model. In this paper, then, we take a seemingly-naive nonstructural time-series approach to modeling and forecasting daily average temperature in ten U.S. cities, and we inquire systematically as to whether it proves useful. The answer is, perhaps surprisingly, "yes."
Time series modeling reveals a wealth of information about both conditional mean dynamics and the conditional variance dynamics of average daily temperature, some of which seems not to have been noticed previously, and it provides similarly sharp insights into both the distributions of weather and the distributions of weather surprises, and the key differences between them. The success of time-series modeling in capturing conditional mean dynamics translates into successful point forecasting, a fact which, together with the success of time-series modeling in identifying and characterizing the distributions of weather surprises, translates as well into successful density forecasting.
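A minimal sketch of the kind of nonstructural conditional-mean specification described above: deterministic seasonality captured by Fourier terms plus autoregressive lags of temperature. The harmonic and lag counts here are illustrative assumptions, not the paper's chosen orders:

```python
import numpy as np

def temperature_design(temps, n_harmonics=3, n_lags=2, period=365.25):
    """Build (X, y) for OLS: intercept, Fourier seasonal terms, and AR lags
    of daily average temperature. Orders are illustrative, not prescribed."""
    temps = np.asarray(temps, dtype=float)
    t = np.arange(len(temps), dtype=float)
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.sin(2.0 * np.pi * k * t / period))
        cols.append(np.cos(2.0 * np.pi * k * t / period))
    seasonal = np.column_stack(cols)[n_lags:]
    # lag j of temperature, aligned with the target y = temps[n_lags:]
    lags = np.column_stack(
        [temps[n_lags - j:len(temps) - j] for j in range(1, n_lags + 1)]
    )
    return np.column_stack([seasonal, lags]), temps[n_lags:]
```

Fitting by `np.linalg.lstsq` then gives the conditional-mean forecasts; modeling the residuals (e.g., with seasonal conditional variance) would address the variance dynamics the abstract describes.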
The prescriptions of modern financial risk management hinge critically on the associated characterization of the distribution of future returns (cf., Diebold, Gunther and Tay, 1998, and Diebold, Hahn and Tay, 1999). Because volatility persistence renders high-frequency returns temporally dependent (e.g., Bollerslev, Chou and Kroner, 1992), it is the conditional return distribution, and not the unconditional distribution, that is of relevance for risk management. This is especially true in high-frequency situations, such as monitoring and managing the risk associated with the day-to-day operations of a trading desk, where volatility clustering is omnipresent.
We propose using the price range in the estimation of stochastic volatility models. We show theoretically, numerically, and empirically that the range is not only a highly efficient volatility proxy, but also that it is approximately Gaussian and robust to microstructure noise. The good properties of the range imply that range-based Gaussian quasi-maximum likelihood estimation produces simple and highly efficient estimates of stochastic volatility models and extractions of latent volatility series. We use our method to examine the dynamics of daily exchange rate volatility and discover that traditional one-factor models are inadequate for describing simultaneously the high- and low-frequency dynamics of volatility. Instead, the evidence points strongly toward two-factor models with one highly persistent factor and one quickly mean-reverting factor.
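To illustrate the range as a volatility proxy, here is a sketch of the classic Parkinson (1980) range-based variance estimator, a simpler relative of the range-based approach the abstract describes (which works with the log range rather than this exact statistic):

```python
import numpy as np

def parkinson_variance(high, low):
    """Daily variance proxy from the high-low price range. For a driftless
    Brownian log-price, E[(ln H - ln L)^2] = 4 ln(2) * sigma^2, so dividing
    the squared log range by 4 ln(2) gives an unbiased variance estimate."""
    log_range = np.log(np.asarray(high, dtype=float) /
                       np.asarray(low, dtype=float))
    return log_range**2 / (4.0 * np.log(2.0))
```

Because the range summarizes the whole intraday path rather than a single close-to-close return, it is far more informative per day, which is the efficiency property the abstract exploits for quasi-maximum likelihood estimation.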
The huge theoretical and empirical econometric literatures on long memory and on structural change have evolved largely independently, as the phenomena appear distinct. We argue, in contrast, that they are intimately related. In particular, we show analytically that stochastic regime switching is easily confused with long memory, so long as only a “small” amount of regime switching occurs (in a sense that we make precise). A Monte Carlo analysis supports the relevance of the asymptotic theory in finite samples and produces additional insights.
We propose a measure of predictability based on the ratio of the expected loss of a short-run forecast to the expected loss of a long-run forecast. This predictability measure can be tailored to the forecast horizons of interest, and it allows for general loss functions, univariate or multivariate information sets, and covariance stationary or difference stationary processes. We propose a simple estimator, and we suggest resampling methods for inference. We then provide several macroeconomic applications. First, we illustrate the implementation of predictability measures based on fitted parametric models for several U.S. macroeconomic time series. Second, we analyze the internal propagation mechanism of a standard dynamic macroeconomic model by comparing the predictability of model inputs and model outputs. Third, we use predictability as a metric for assessing the similarity of data simulated from the model and actual data. Finally, we outline several nonparametric extensions of our approach.
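As a concrete special case of the loss-ratio idea above, consider squared-error loss and an AR(1) process with coefficient phi. The j-step forecast-error variance is proportional to (1 - phi^(2j)), so the predictability of horizon j relative to a long horizon k reduces to a closed form (this AR(1)/MSE specialization is an illustrative assumption; the measure itself is far more general):

```python
def ar1_predictability(phi, j, k):
    """Predictability P(j, k) = 1 - E[e_j^2] / E[e_k^2] for an AR(1)
    under squared-error loss, with short horizon j < long horizon k.
    Near 1: highly predictable at horizon j; near 0: unpredictable."""
    if not 0.0 < abs(phi) < 1.0:
        raise ValueError("phi must satisfy 0 < |phi| < 1 (stationarity)")
    return 1.0 - (1.0 - phi**(2 * j)) / (1.0 - phi**(2 * k))
```

For phi = 0.9, one-step-ahead predictability relative to a distant horizon is about 0.81 (= phi^2), and it decays toward zero as the short horizon lengthens, matching the intuition that stationary series are unpredictable in the long run.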
The turmoil in the capital markets in 1997 and 1998 has highlighted the need for systematic stress testing of banks’ portfolios, including both their trading and lending books. We propose that underlying macroeconomic volatility is a key part of a useful conceptual framework for stress testing credit portfolios, and that credit migration matrices provide the specific linkages between underlying macroeconomic conditions and asset quality. Credit migration matrices, which characterize the expected changes in credit quality of obligors, are cardinal inputs to many applications, including portfolio risk assessment, modeling the term structure of credit risk premia, and pricing of credit derivatives. They are also an integral part of many of the credit portfolio models used by financial institutions. By separating the economy into two states or regimes, expansion and contraction, and conditioning the migration matrix on these states, we show that the loss distribution of credit portfolios can differ greatly, as can the concomitant level of economic capital to be assigned.
Using high-frequency data on Deutschemark and Yen returns against the dollar, we construct model-free estimates of daily exchange rate volatility and correlation, covering an entire decade. Our estimates, termed realized volatilities and correlations, are not only model-free, but also approximately free of measurement error under general conditions, which we discuss in detail. Hence, for practical purposes, we may treat the exchange rate volatilities and correlations as observed rather than latent. We do so, and we characterize their joint distribution, both unconditionally and conditionally. Noteworthy results include a simple normality-inducing volatility transformation, high contemporaneous correlation across volatilities, high correlation between correlation and volatilities, pronounced and persistent dynamics in volatilities and correlations, evidence of long-memory dynamics in volatilities and correlations, and remarkably precise scaling laws under temporal aggregation.
This paper provides a general framework for integration of high-frequency intraday data into the measurement, modeling, and forecasting of daily and lower frequency volatility and return distributions. Most procedures for modeling and forecasting financial asset return volatilities, correlations, and distributions rely on restrictive and complicated parametric multivariate ARCH or stochastic volatility models, which often perform poorly at intraday frequencies. Use of realized volatility constructed from high-frequency intraday returns, in contrast, permits the use of traditional time series procedures for modeling and forecasting. Building on the theory of continuous-time arbitrage-free price processes and the theory of quadratic variation, we formally develop the links between the conditional covariance matrix and the concept of realized volatility. Next, using continuously recorded observations for the Deutschemark / Dollar and Yen / Dollar spot exchange rates covering more than a decade, we find that forecasts from a simple long-memory Gaussian vector autoregression for the logarithmic daily realized volatilities perform admirably compared to popular daily ARCH and related models. Moreover, the vector autoregressive volatility forecast, coupled with a parametric lognormal-normal mixture distribution implied by the theoretically and empirically grounded assumption of normally distributed standardized returns, gives rise to well-calibrated density forecasts of future returns, and correspondingly accurate quantile estimates. Our results hold promise for practical modeling and forecasting of the large covariance matrices relevant in asset pricing, asset allocation and financial risk management applications.
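The realized volatility construction underlying the framework above is simple: a day's realized variance is the sum of squared intraday log returns, which (by the theory of quadratic variation) converges to that day's integrated variance as the sampling frequency grows, for an arbitrage-free semimartingale price and ignoring microstructure noise. A minimal sketch, including the realized covariance analogue:

```python
import numpy as np

def realized_variance(intraday_prices):
    """Daily realized variance: sum of squared intraday log returns."""
    r = np.diff(np.log(np.asarray(intraday_prices, dtype=float)))
    return np.sum(r**2)

def realized_covariance(intraday_prices_a, intraday_prices_b):
    """Daily realized covariance: sum of cross products of the two
    assets' intraday log returns (sampled on a common grid)."""
    ra = np.diff(np.log(np.asarray(intraday_prices_a, dtype=float)))
    rb = np.diff(np.log(np.asarray(intraday_prices_b, dtype=float)))
    return np.sum(ra * rb)
```

Because these daily measures are effectively observed data, ordinary time series tools, such as the long-memory vector autoregression on logarithmic realized volatilities described above, can then be applied directly.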
We exploit direct model-free measures of daily equity return volatility and correlation obtained from high-frequency intraday transaction prices on individual stocks in the Dow Jones Industrial Average over a five-year period to confirm, solidify and extend existing characterizations of stock return volatility and correlation. We find that the unconditional distributions of the variances and covariances for all thirty stocks are leptokurtic and highly skewed to the right, while the logarithmic standard deviations and correlations all appear approximately Gaussian. Moreover, the distributions of the returns scaled by the realized standard deviations are also Gaussian. Consistent with our documentation of remarkably precise scaling laws under temporal aggregation, the realized logarithmic standard deviations and correlations all show strong temporal dependence and appear to be well described by long-memory processes. Positive returns have less impact on future variances and correlations than negative returns of the same absolute magnitude, although the economic importance of this asymmetry is minor. Finally, there is strong evidence that equity volatilities and correlations move together, possibly reducing the benefits to portfolio diversification when the market is most volatile. Our findings are broadly consistent with a latent volatility factor structure, and they set the stage for improved high-dimensional volatility modeling and out-of-sample forecasting, which in turn hold promise for the development of better decision making in practical situations of risk management, portfolio allocation, and asset pricing.
Broadly defined, macroeconomic forecasting is alive and well. Nonstructural forecasting, which is based largely on reduced-form correlations, has always been well and continues to improve. Structural forecasting, which aligns itself with economic theory and hence rises and falls with theory, receded following the decline of Keynesian theory. In recent years, powerful new dynamic stochastic general equilibrium theory has been developed, and structural macroeconomic forecasting is poised for resurgence.
We show that the common practice of converting 1-day volatility estimates to h-day estimates by scaling by the square root of h is inappropriate and produces overestimates of the variability of long-horizon volatility. We conclude that volatility models are best tailored to tasks: if interest centers on long-horizon volatility, then a long-horizon volatility model should be used. Economic considerations, however, confound even that prescription and point to important directions for future research.
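The core problem with the sqrt-h rule is that it ignores mean reversion in volatility. A sketch of the contrast for a GARCH(1,1) model (the parameter values in the usage note are illustrative assumptions): the correct h-day conditional variance sums horizon-by-horizon forecasts, each of which decays toward the long-run variance, whereas the sqrt-h rule simply multiplies the current one-day variance by h:

```python
def garch_hday_variance(h, sigma2_next, omega, alpha, beta):
    """h-day conditional variance for GARCH(1,1), summing the forecasts
    E[sigma^2_{t+i}], which mean-revert toward omega / (1 - alpha - beta).
    Compare with the sqrt-h rule's answer: h * sigma2_next."""
    persistence = alpha + beta
    long_run = omega / (1.0 - persistence)
    total, ev = 0.0, sigma2_next
    for _ in range(h):
        total += ev
        ev = long_run + persistence * (ev - long_run)  # mean reversion
    return total
```

When the current variance is above its long-run level, the sqrt-h rule overstates the h-day variance (and vice versa), and because it passes today's variance fluctuations through to every horizon undamped, it exaggerates the variability of long-horizon volatility, which is the paper's point.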
We propose a constructive, multivariate framework for assessing agreement between (generally misspecified) dynamic equilibrium models and data, a framework which enables a complete second-order comparison of the dynamic properties of models and data. We use bootstrap algorithms to evaluate the significance of deviations between models and data, and we use goodness-of-fit criteria to produce estimators that optimize economically relevant loss functions. We provide a detailed illustrative application to modeling the U.S. cattle cycle.
Prediction problems involving asymmetric loss functions arise routinely in many fields, yet the theory of optimal prediction under asymmetric loss is not well developed. We study the optimal prediction problem under general loss structures and characterize the optimal predictor. We compute the optimal predictor analytically in two leading cases. Analytic solutions for the optimal predictor are not available in more complicated cases, so we develop numerical procedures for computing it. We illustrate the results by forecasting the GARCH(1,1) process which, although white noise, is non-trivially forecastable under asymmetric loss.
We propose several methods for evaluating and improving density forecasts. We focus primarily on methods that are applicable regardless of the particular user's loss function, but we also show how to use information about the loss function when and if it is known. Throughout, we take explicit account of the relationships between density forecasts, action choices, and the corresponding expected loss.
It is widely believed that imposing cointegration on a forecasting system, if cointegration is in fact present, will improve long-horizon forecasts. Contrary to this belief, we show that at long horizons nothing is lost by ignoring cointegration when the forecasts are evaluated using standard multivariate forecast accuracy measures. In fact, simple univariate Box-Jenkins forecasts are just as accurate. Our results highlight a potentially important deficiency of standard forecast accuracy measures -- they fail to value the maintenance of cointegrating relationships among variables -- and we suggest alternatives that explicitly do so.