Working Papers
By Year:
Paper #  Author  Title  

16022 
Laura Liu, Hyungsik Roger Moon, Frank Schorfheide
"Forecasting with Dynamic Panel Data Models"
This paper considers the problem of forecasting a collection of short time series using cross-sectional information in panel data. We construct point predictors using Tweedie's formula for the posterior mean of heterogeneous coefficients under a correlated random effects distribution. This formula utilizes cross-sectional information to transform the unit-specific (quasi) maximum likelihood estimator into an approximation of the posterior mean under a prior distribution that equals the population distribution of the random coefficients. We show that the risk of a predictor based on a nonparametric estimate of the Tweedie correction is asymptotically equivalent to the risk of a predictor that treats the correlated-random-effects distribution as known (ratio optimality). Our empirical Bayes predictor performs well compared to various competitors in a Monte Carlo study. In an empirical application we use the predictor to forecast revenues for a large panel of bank holding companies and compare forecasts that condition on actual and severely adverse macroeconomic conditions.
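The mechanics of the Tweedie correction can be sketched in a few lines: if a noisy estimate satisfies y_i = lambda_i + eps_i with Gaussian noise, then E[lambda_i | y_i] = y_i + sigma^2 * d/dy log p(y_i), where the marginal density p is estimated nonparametrically. The following is a minimal illustration under an assumed Gaussian setup (all distributions and parameter values are hypothetical; the paper's quasi-ML panel setting is richer):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Hypothetical setup: heterogeneous coefficients lambda_i drawn from an
# unknown population distribution; y_i is a noisy estimate of lambda_i.
lam = rng.normal(1.0, 0.5, size=2000)
sigma = 0.3
y = lam + rng.normal(0.0, sigma, size=lam.size)

# Tweedie's formula: E[lambda | y] = y + sigma^2 * d/dy log p(y),
# with the marginal density p(y) estimated nonparametrically (here a KDE).
kde = gaussian_kde(y)
eps = 1e-4

def posterior_mean(z):
    z = np.asarray(z, dtype=float)
    # Numerical derivative of the log marginal density (the "score").
    score = (np.log(kde(z + eps)) - np.log(kde(z - eps))) / (2 * eps)
    return z + sigma**2 * score

shrunk = posterior_mean(y)

# The empirical-Bayes predictor shrinks noisy estimates toward regions of
# high marginal density, reducing mean squared error relative to raw y.
mse_raw = np.mean((y - lam) ** 2)
mse_eb = np.mean((shrunk - lam) ** 2)
```

In this stylized example the corrected estimates have lower mean squared error than the raw noisy estimates, which is the sense in which the correction "transforms" the unit-specific estimator into an approximate posterior mean.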


16017 
Edward Herbst, Frank Schorfheide
"Tempered Particle Filtering"
The accuracy of particle filters for nonlinear state-space models crucially depends on the proposal distribution that mutates time t − 1 particle values into time t values. In the widely used bootstrap particle filter this distribution is generated by the state-transition equation. While straightforward to implement, its practical performance is often poor. We develop a self-tuning particle filter in which the proposal distribution is constructed adaptively through a sequence of Monte Carlo steps. Intuitively, we start from a measurement-error distribution with an inflated variance, and then gradually reduce the variance to its nominal level in a sequence of steps that we call tempering. We show that the filter generates an unbiased and consistent approximation of the likelihood function. Holding the run time fixed, our filter is substantially more accurate than the bootstrap particle filter in two DSGE model applications.
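The tempering idea can be illustrated with a hypothetical one-period scalar example: particle weights are bridged through measurement densities whose variance is inflated by 1/φ, with φ stepping up to its nominal value of 1. The model, schedule, and numbers below are all assumptions for illustration, and the mutation (Metropolis-Hastings) steps the paper uses between tempering stages are omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical scalar state-space model:
# state transition s_t = 0.9 * s_{t-1} + e_t, measurement y_t = s_t + u_t,
# u_t ~ N(0, sigma_u^2). Bootstrap proposal: draw from the transition only.
sigma_u = 0.1
y_t = 1.5
particles = 0.9 * rng.normal(0.0, 1.0, size=5000) + rng.normal(0.0, 0.2, size=5000)

def log_meas(s, phi):
    # Measurement log-density with variance inflated by 1/phi, phi in (0, 1].
    var = sigma_u**2 / phi
    return -0.5 * np.log(2 * np.pi * var) - 0.5 * (y_t - s) ** 2 / var

phis = [0.05, 0.2, 0.5, 1.0]   # assumed tempering schedule
phi_prev = 0.0
for phi in phis:
    # Incremental weights bridge from the previous tempered density to the
    # current one; the first stage starts from the transition proposal.
    if phi_prev == 0.0:
        log_w = log_meas(particles, phi)
    else:
        log_w = log_meas(particles, phi) - log_meas(particles, phi_prev)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    # Resample to equalize weights before the next tempering step
    # (the paper additionally mutates particles here, which is omitted).
    idx = rng.choice(particles.size, size=particles.size, p=w)
    particles = particles[idx]
    phi_prev = phi

post_mean = particles.mean()
```

Because each tempering stage only mildly sharpens the target, the weights degenerate far less than a single bootstrap reweighting toward the nominal (small-variance) measurement density would.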


15042 
Jesus Fernandez-Villaverde, Juan F. Rubio-Ramírez, Frank Schorfheide
"Solution and Estimation Methods for DSGE Models"
This chapter provides an overview of solution and estimation techniques for dynamic stochastic general equilibrium (DSGE) models. We cover the foundations of numerical approximation techniques as well as statistical inference and survey the latest developments in the field.


15018 
Francis X. Diebold, Frank Schorfheide, Minchul Shin
"Real-Time Forecast Evaluation of DSGE Models with Stochastic Volatility"
Recent work has analyzed the forecasting performance of standard dynamic stochastic general equilibrium (DSGE) models, but little attention has been given to DSGE models that incorporate nonlinearities in exogenous driving processes. Against that background, we explore whether incorporating stochastic volatility improves DSGE forecasts (point, interval, and density). We examine real-time forecast accuracy for key macroeconomic variables including output growth, inflation, and the policy rate. We find that incorporating stochastic volatility in DSGE models of macroeconomic fundamentals markedly improves their density forecasts, just as incorporating stochastic volatility in models of financial asset returns improves their density forecasts.


14035 
S. Boragan Aruoba, Pablo Cuba-Borda, Frank Schorfheide
"Macroeconomic Dynamics Near the ZLB: A Tale of Two Countries"
We propose and solve a small-scale New Keynesian model with Markov sunspot shocks that move the economy between a targeted-inflation regime and a deflation regime, and fit it to data from the U.S. and Japan. For the U.S. we find that adverse demand shocks moved the economy to the zero lower bound (ZLB) in 2009 and that expansionary monetary policy has kept it there subsequently. In contrast, Japan switched to the deflation regime in 1999 and has remained there since, except for a short period. The two scenarios have drastically different implications for macroeconomic policies. Fiscal multipliers are about 20% smaller in the deflation regime, despite the economy remaining at the ZLB. While a commitment by the central bank to keep rates near the ZLB doubles the fiscal multipliers in the targeted-inflation regime (U.S.), it has no effect in the deflation regime (Japan).


14034 
Marco Del Negro, Raiden Hasegawa, Frank Schorfheide
"Dynamic Prediction Pools: An Investigation of Financial Frictions and Forecasting Performance"
We provide a novel methodology for estimating time-varying weights in linear prediction pools, which we call dynamic pools, and use it to investigate the relative forecasting performance of DSGE models with and without financial frictions for output growth and inflation from 1992 to 2011. We find strong evidence of time variation in the pool's weights, reflecting the fact that the DSGE model with financial frictions produces superior forecasts in periods of financial distress but does not perform as well in tranquil periods. The dynamic pool's weights react in a timely fashion to changes in the environment, leading to real-time forecast improvements relative to other methods of density forecast combination, such as Bayesian model averaging, optimal (static) pools, and equal weights. We show how a policymaker dealing with model uncertainty could have used a dynamic pool to perform a counterfactual exercise (responding to the gap in labor market conditions) in the immediate aftermath of the Lehman crisis.
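A stylized sketch of a dynamic pool: the weight λ_t on model 1's predictive density follows a persistent latent process, here approximated by a discretized random walk on a grid over [0, 1] and filtered by Bayes' rule. The data-generating process, kernel bandwidth, and grid below are all hypothetical, and the paper's particle-filter implementation is richer:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical one-step-ahead predictive densities from two models,
# evaluated at the realized data point each period. Which model is
# better alternates every 50 periods.
T = 200
regime = (np.arange(T) // 50) % 2
p1 = np.where(regime == 0, 0.9, 0.3) + 0.05 * rng.random(T)
p2 = np.where(regime == 0, 0.3, 0.9) + 0.05 * rng.random(T)

# Latent weight lambda_t: discretized Gaussian random walk on a grid.
grid = np.linspace(0.01, 0.99, 99)
belief = np.full(grid.size, 1.0 / grid.size)          # flat initial belief
kernel = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / 0.05) ** 2)
kernel /= kernel.sum(axis=0, keepdims=True)           # columns are transitions

log_score = 0.0
weights_path = []
for t in range(T):
    belief = kernel @ belief                          # propagate latent weight
    pool = grid * p1[t] + (1 - grid) * p2[t]          # pooled density per grid point
    log_score += np.log(belief @ pool)                # one-step predictive score
    belief = belief * pool / (belief @ pool)          # Bayes update of the weight
    weights_path.append(grid @ belief)                # posterior mean of lambda_t

weights_path = np.array(weights_path)
```

In this toy example the filtered weight drifts toward model 1 while it forecasts well and migrates toward model 2 after the regime flips, which is the timely-reaction property the abstract describes.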


13059 
Frank Schorfheide, Kenneth I. Wolpin
"To Hold Out or Not to Hold Out"
A recent literature has developed that combines two prominent empirical approaches to ex ante policy evaluation: randomized controlled trials (RCT) and structural estimation. The RCT provides a "gold-standard" estimate of a particular treatment, but only of that treatment. Structural estimation provides the capability to extrapolate beyond the experimental treatment, but is based on untestable assumptions and is subject to structural data mining. Combining the approaches by holding out either the treatment or the control sample from the structural estimation exercise allows for external validation of the underlying behavioral model. Although intuitively appealing, this holdout methodology is not well grounded; for instance, it is easy to show that it is suboptimal from a Bayesian perspective. Using a stylized representation of a randomized controlled trial, we provide a formal rationale for the use of a holdout sample in an environment in which data mining poses an impediment to the implementation of the ideal Bayesian analysis, along with a numerical illustration of the potential benefits of holdout samples.


13016 
S. Boragan Aruoba, Francis X. Diebold, Jeremy Nalewaik, Frank Schorfheide, Dongho Song
"Improving GDP Measurement: A Measurement-Error Perspective"
We provide a new and superior measure of U.S. GDP, obtained by applying optimal signal-extraction techniques to the (noisy) expenditure-side and income-side estimates. Its properties, particularly as regards serial correlation, differ markedly from those of the standard expenditure-side measure and lead to substantially revised views regarding the properties of GDP.


12020 
Fei Chen, Francis X. Diebold, Frank Schorfheide
"A Markov-Switching Multi-Fractal Inter-Trade Duration Model, with Application to U.S. Equities"
We propose and illustrate a Markov-switching multifractal duration (MSMD) model for the analysis of inter-trade durations in financial markets. We establish several of its key properties, with emphasis on high persistence (indeed long memory). Empirical exploration suggests MSMD's superiority relative to leading competitors.


11028 
S. Boragan Aruoba, Francis X. Diebold, Jeremy Nalewaik, Frank Schorfheide, Dongho Song
"Improving GDP Measurement: A Forecast Combination Perspective"
Two often-divergent U.S. GDP estimates are available: a widely used expenditure-side version, GDP_E, and a much less widely used income-side version, GDP_I. We propose and explore a "forecast combination" approach to combining them. We then put the theory to work, producing a superior combined estimate of GDP growth for the U.S., GDP_C. We compare GDP_C to GDP_E and GDP_I, with particular attention to behavior over the business cycle. We discuss several variations and extensions.
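The combination idea can be illustrated with a stylized simulation in which two noisy measures of the same true series are averaged with inverse-variance weights. This is a textbook combination rule used purely for illustration; the paper's estimator is richer, and all of the numbers below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical true GDP growth g, observed through two noisy measures
# (an expenditure-side and an income-side analogue) with different
# measurement-error variances.
T = 500
g = rng.normal(0.5, 1.0, size=T)
gdp_e = g + rng.normal(0.0, 0.6, size=T)
gdp_i = g + rng.normal(0.0, 0.9, size=T)

# Treat each measure as a "forecast" of true growth and combine with
# weights inversely proportional to the error variances (here assumed
# known; in practice they would be estimated).
var_e, var_i = 0.6**2, 0.9**2
w_e = (1 / var_e) / (1 / var_e + 1 / var_i)
gdp_c = w_e * gdp_e + (1 - w_e) * gdp_i

mse_e = np.mean((gdp_e - g) ** 2)
mse_c = np.mean((gdp_c - g) ** 2)
```

Under these assumptions the combined series tracks true growth more closely than either individual measure, which is the sense in which a combined estimate can be "superior".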


02025 
Thomas A. Lubik, Frank Schorfheide
"Testing for Indeterminacy: An Application to U.S. Monetary Policy"
This paper considers a prototypical monetary business cycle model for the U.S. economy, in which the equilibrium is indeterminate if monetary policy is 'inactive'. In previous multivariate studies it has been common practice to restrict parameter estimates to values for which the equilibrium is unique. We show how the likelihood-based estimation of dynamic stochastic general equilibrium models can be extended to allow for indeterminacies and sunspot fluctuations. We propose a posterior odds test for the hypothesis that the data are best explained by parameters that imply determinacy. Our empirical results show that the Volcker-Greenspan policy regime is consistent with determinacy, whereas the pre-Volcker regime is not. We find that before 1979 non-fundamental sunspot shocks may have contributed significantly to inflation and interest rate volatility, but essentially did not affect output fluctuations.


02024 
Marco Del Negro, Frank Schorfheide
"Priors from General Equilibrium Models for VARs"
This paper uses a simple New Keynesian monetary DSGE model as a prior for a VAR and shows that the resulting model is competitive with standard benchmarks in terms of forecasting and can be used for policy analysis.


01047 
Thomas A. Lubik, Frank Schorfheide
"Computing Sunspots in Linear Rational Expectations Models"
We provide a computationally simple method of including and analyzing the effects of sunspot shocks in linear rational expectations models when the equilibrium is indeterminate. Under non-uniqueness, sunspots can affect model dynamics through endogenous forecast errors that do not completely adjust to fundamental shocks alone. We show that sunspot shocks can be modeled as exogenous belief shocks which can be included in the set of fundamentals. By means of a simple example we illustrate that the exact specification of the transmission mechanism of the belief shocks is irrelevant for the solution of the model.


01023 
Yongsung Chang, Joao Gomes, Frank Schorfheide
"Learning-by-Doing as a Propagation Mechanism"
This paper suggests that skill accumulation through past work experience, or "learning-by-doing", can provide an important propagation mechanism for initial shocks, as the current labor supply affects future productivity. Our econometric analysis uses a Bayesian approach to combine micro-level panel data with aggregate time series. Formal model evaluation shows that the introduction of the LBD mechanism improves the model's ability to fit the dynamics of aggregate output and hours.


01016 
Hyungsik Roger Moon, Frank Schorfheide
"Minimum Distance Estimation of Nonstationary Time Series Models"
This paper establishes the consistency and limit distribution of minimum distance (MD) estimators for time series models with deterministic or stochastic trends. We consider models that are linear in the variables but involve nonlinear restrictions across parameters. Two complications arise. First, the unrestricted and restricted parameter spaces have to be rotated to separate fast-converging components of the MD estimator from slowly converging components. Second, if the model includes stochastic trends it is desirable to use a random matrix to weight the discrepancy between the unrestricted and restricted parameter estimates; in this case, the objective function of the MD estimator has a stochastic limit. We provide regularity conditions for the nonlinear restriction function that are easier to verify than the stochastic equicontinuity conditions that typically arise from direct estimation of the restricted parameters. We derive the optimal weight matrix when the limit distribution of the unrestricted estimator is mixed normal and propose a goodness-of-fit test based on overidentifying restrictions. As applications, we investigate cointegration regression models, present-value models, and a permanent-income model based on a linear-quadratic dynamic programming problem.


00002 
Yongsung Chang, Frank Schorfheide
"Labor Supply Shifts and Economic Fluctuations"
We investigate the role of labor-supply shifts in economic fluctuations. We propose a new VAR identification scheme for labor-supply shocks that does not rely on "zero restrictions." According to our VAR analysis of postwar U.S. data, labor-supply shifts account for about half of the variation in hours and one-fifth of the variation in aggregate output. To assess the role of labor-supply shifts in a more structural framework, estimates from a dynamic stochastic general equilibrium (DSGE) model with stochastic variation in home-production technology are compared to those from the VAR.


00001 
Yongsung Chang, Joao Gomes, Frank Schorfheide
"Persistence"
To generate persistence we augment the standard real business cycle (RBC) model with a "learning-by-doing" (LBD) mechanism, where current labor supply affects workers' future labor productivity. Our econometric analysis shows that the LBD model fits aggregate data much better than the standard RBC model. We calculate posterior odds for the structural models and formally show that the LBD model more closely mimics the autocorrelation and impulse response patterns that we found in a bivariate VAR analysis.


99007 
Frank Schorfheide 
"A Unified Econometric Framework for the Evaluation of DSGE Models"  
A unified Bayesian framework for the econometric evaluation of dynamic stochastic general equilibrium (DSGE) models is presented. The evaluation is coherent both under misspecification, that is, when all DSGE models in the candidate set have low posterior probability, and under correct specification. The framework encompasses many of the existing evaluation schemes as special cases, including Kydland and Prescott's (1996) informal calibration and the traditional macroeconometric approach of judging models according to their ability to track and forecast aggregate time series. A detailed illustrative application of the framework to a standard cash-in-advance model and a liquidity model is provided. The models are evaluated according to their predictions of comovements between output growth and inflation, and their responses to discretionary changes in the growth rate of the money supply.


99006 
Frank Schorfheide 
"Loss Function vs. Likelihood Estimation of Forecasting Models: A Pretest Procedure and a Bayesian Interpretation"  
The paper considers the problem of using a vector autoregression (VAR) to forecast a stationary process several periods into the future. If the VAR is misspecified, it might be best to estimate its parameters under the same loss function used to evaluate the forecasts. A plausible and straightforward procedure is to conduct a model check of the VAR before adopting a loss-function estimator: if the VAR is discredited by the data, a loss-function estimator is used; otherwise the parameters are estimated by a likelihood-based technique. We calculate the asymptotic prediction risk of such a pretest procedure under the assumption that the data are generated from a linear process that drifts toward the VAR as the sample size tends to infinity. The pretest can avoid picking the inferior estimator when the stakes are high, which is confirmed by a small Monte Carlo study. A Bayesian interpretation of loss-function estimation and the pretest procedure is provided: a forecaster places nonzero prior probability on a reference model but finds it too onerous to calculate its posterior predictive distribution, and instead chooses a prediction procedure based on the VAR that has a small integrated prediction risk.
