This chapter provides an overview of solution and estimation techniques for dynamic stochastic general equilibrium (DSGE) models. We cover the foundations of numerical approximation techniques as well as statistical inference and survey the latest developments in the field. Download Paper
We propose a novel method to estimate dynamic equilibrium models with stochastic volatility. First, we characterize the properties of the solution to this class of models. Second, we take advantage of the results about the structure of the solution to build a sequential Monte Carlo algorithm to evaluate the likelihood function of the model. The approach, which exploits the profusion of shocks in stochastic volatility models, is versatile and computationally tractable even in large-scale models, such as those often employed by policy-making institutions. As an application, we use our algorithm and Bayesian methods to estimate a business cycle model of the U.S. economy with both stochastic volatility and parameter drifting in monetary policy. Our application shows the importance of stochastic volatility in accounting for the dynamics of the data. Download Paper
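The sequential Monte Carlo evaluation described above can be illustrated with a bootstrap particle filter on a toy scalar state-space model. This is a minimal sketch under assumed parameter values; it is not the paper's business cycle model or its algorithm, which exploits the specific structure of stochastic volatility models.

```python
import math
import random

random.seed(0)

# Bootstrap particle filter for a toy scalar state-space model:
#   x_t = rho * x_{t-1} + sigma * eps_t,   y_t = x_t + sigma_y * nu_t,
# with eps_t, nu_t standard normal. Illustrative only.
def particle_filter_loglik(ys, rho, sigma, sigma_y, n_particles=500):
    # Initialize particles from the stationary distribution of x_t.
    sd0 = sigma / math.sqrt(1 - rho ** 2)
    particles = [random.gauss(0.0, sd0) for _ in range(n_particles)]
    loglik = 0.0
    for y in ys:
        # Propagate each particle through the state transition.
        particles = [rho * x + sigma * random.gauss(0.0, 1.0) for x in particles]
        # Weight each particle by the measurement density p(y | x).
        weights = [math.exp(-0.5 * ((y - x) / sigma_y) ** 2) /
                   (sigma_y * math.sqrt(2 * math.pi)) for x in particles]
        # The mean weight estimates the period-t likelihood contribution.
        loglik += math.log(sum(weights) / n_particles)
        # Resample particles in proportion to their weights.
        particles = random.choices(particles, weights=weights, k=n_particles)
    return loglik
```

In an estimation exercise, this likelihood estimate would be embedded in a Bayesian sampler over the structural parameters.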
We study the effects of changes in uncertainty about future fiscal policy on aggregate economic activity. Fiscal deficits and public debt have risen sharply in the wake of the financial crisis. While these developments make fiscal consolidation inevitable, there is considerable uncertainty about the policy mix and timing of such budgetary adjustment. To evaluate the consequences of this increased uncertainty, we first estimate tax and spending processes for the U.S. that allow for time-varying volatility. We then feed these processes into an otherwise standard New Keynesian business cycle model calibrated to the U.S. economy. We find that fiscal volatility shocks have an adverse effect on economic activity that is comparable to the effects of a 25-basis-point innovation in the federal funds rate. Download Paper
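A tax process with time-varying volatility of the kind described above can be sketched as follows. All parameter values here are illustrative placeholders, not the paper's estimates; the point is that a volatility shock widens the spread of future tax innovations without moving the tax rate itself on impact.

```python
import math
import random

random.seed(42)

# Sketch of a fiscal process with stochastic volatility:
#   tau_t   = (1 - rho) * tau_bar + rho * tau_{t-1} + exp(sigma_t) * eps_t
#   sigma_t = (1 - rho_s) * sigma_bar + rho_s * sigma_{t-1} + eta * u_t
# eps_t is the ordinary tax shock; u_t is the fiscal volatility shock.
def simulate_tax_process(T, tau_bar=0.2, rho=0.95,
                         sigma_bar=math.log(0.01), rho_s=0.9, eta=0.3):
    tau, sigma = tau_bar, sigma_bar
    path = []
    for _ in range(T):
        # The log standard deviation follows its own AR(1) process.
        sigma = (1 - rho_s) * sigma_bar + rho_s * sigma + eta * random.gauss(0, 1)
        # The tax rate mean-reverts with an innovation scaled by exp(sigma_t).
        tau = (1 - rho) * tau_bar + rho * tau + math.exp(sigma) * random.gauss(0, 1)
        path.append(tau)
    return path
```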
In this paper we report the results of the estimation of a rich dynamic stochastic general equilibrium (DSGE) model of the U.S. economy with both stochastic volatility and parameter drifting in the Taylor rule. We use the results of this estimation to examine the recent monetary history of the U.S. and to interpret, through this lens, the sources of the rise and fall of the great American inflation from the late 1960s to the early 1980s and of the great moderation of business cycle fluctuations between 1984 and 2007. Our main findings are that while there is strong evidence of changes in monetary policy during Volcker’s tenure at the Fed, those changes contributed little to the great moderation. Instead, changes in the volatility of structural shocks account for most of it. Also, while we find that monetary policy was different under Volcker, we do not find much evidence of a big difference in monetary policy among Burns, Miller, and Greenspan. The difference in aggregate outcomes across these periods is attributed to the time-varying volatility of shocks. The history for inflation is more nuanced, as a more vigorous stand against it would have reduced inflation in the 1970s, but not completely eliminated it. In addition, we find that volatile shocks (especially those related to aggregate demand) were important contributors to the great American inflation. Download Paper
This paper compares the role of stochastic volatility versus changes in monetary policy rules in accounting for the time-varying volatility of U.S. aggregate data. Of special interest to us is understanding the sources of the great moderation of business cycle fluctuations that the U.S. economy experienced between 1984 and 2007. To explore this issue, we build a medium-scale dynamic stochastic general equilibrium (DSGE) model with both stochastic volatility and parameter drifting in the Taylor rule and we estimate it non-linearly using U.S. data and Bayesian methods. Methodologically, we show how to confront such a rich model with the data by exploiting the structure of the high-order approximation to the decision rules that characterize the equilibrium of the economy. Our main empirical findings are: 1) even after controlling for stochastic volatility (and there is a fair amount of it), there is overwhelming evidence of changes in monetary policy during the analyzed period; 2) however, these changes in monetary policy mattered little for the great moderation; 3) most of the great performance of the U.S. economy during the 1990s was a result of good shocks; and 4) the response of monetary policy to inflation under Burns, Miller, and Greenspan was similar, while it was much higher under Volcker. Download Paper
This paper shows how to build algorithms that use graphics processing units (GPUs) installed in most modern computers to solve dynamic equilibrium models in economics. In particular, we rely on the compute unified device architecture (CUDA) of NVIDIA GPUs. We illustrate the power of the approach by solving a simple real business cycle model with value function iteration. We document improvements in speed of around 200 times and suggest that even further gains are likely. Download Paper
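A serial CPU baseline of the value function iteration that the paper accelerates on GPUs can be sketched in a few lines; a CUDA version would evaluate the maximization over next-period capital for all grid points in parallel. The model below is the deterministic growth model with log utility and full depreciation, chosen as a stand-in (not the paper's calibration) because its known closed-form policy k' = αβk^α lets us check accuracy.

```python
import math

alpha, beta = 0.3, 0.95
n = 100
grid = [0.01 + i * (0.5 - 0.01) / (n - 1) for i in range(n)]

def vfi(tol=1e-7):
    """Value function iteration on a fixed capital grid (serial baseline)."""
    V = [0.0] * n
    while True:
        newV, policy = [], []
        for k in grid:
            y = k ** alpha                      # output with full depreciation
            best, best_kp = -float("inf"), grid[0]
            for j, kp in enumerate(grid):
                c = y - kp
                if c <= 0:
                    break                       # grid is increasing, so c only shrinks
                val = math.log(c) + beta * V[j]
                if val > best:
                    best, best_kp = val, kp
            newV.append(best)
            policy.append(best_kp)
        if max(abs(a - b) for a, b in zip(newV, V)) < tol:
            return newV, policy
        V = newV
```

The loop over current capital (and the inner maximization over next-period capital) is the embarrassingly parallel part that maps naturally onto GPU threads.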
We solve a dynamic stochastic general equilibrium (DSGE) model in which the representative household has Epstein and Zin recursive preferences. The parameters governing preferences and technology are estimated by means of maximum likelihood using macroeconomic data and asset prices, with a particular focus on the term structure of interest rates. We estimate a large risk aversion, an elasticity of intertemporal substitution higher than one, and substantial adjustment costs. Furthermore, we identify the tensions within the model by estimating it on subsets of these data. We conclude by pointing out potential extensions that might improve the model’s fit. Download Paper
This paper compares different solution methods for computing the equilibrium of dynamic stochastic general equilibrium (DSGE) models with recursive preferences such as those in Epstein and Zin (1989 and 1991). Models with these preferences have recently become popular, but we know little about the best ways to implement them numerically. To fill this gap, we solve the stochastic neoclassical growth model with recursive preferences using four different approaches: second and third-order perturbation, Chebyshev polynomials, and value function iteration. We document the performance of the methods in terms of computing time, implementation complexity, and accuracy. Our main finding is that a third-order perturbation is competitive in terms of accuracy with Chebyshev polynomials and value function iteration, while being an order of magnitude faster to run. Therefore, we conclude that perturbation methods are an attractive approach for computing this class of problems. Download Paper
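One building block of the Chebyshev projection approach compared above is polynomial interpolation at Chebyshev nodes. The sketch below fits an assumed test function rather than the model's value function; it shows why a modest number of coefficients can rival much denser grids for smooth functions.

```python
import math

def cheb_nodes(n, a, b):
    """n Chebyshev nodes mapped from [-1, 1] to [a, b]."""
    return [0.5 * (a + b) + 0.5 * (b - a) * math.cos((2 * k + 1) * math.pi / (2 * n))
            for k in range(n)]

def cheb_fit(f, n, a, b):
    """Degree-(n-1) Chebyshev interpolant of f on [a, b]."""
    xs = cheb_nodes(n, a, b)
    ys = [f(x) for x in xs]
    # Coefficients via discrete orthogonality of T_j at the nodes.
    coeffs = []
    for j in range(n):
        s = sum(ys[k] * math.cos(j * (2 * k + 1) * math.pi / (2 * n))
                for k in range(n))
        coeffs.append((2.0 if j > 0 else 1.0) * s / n)
    def approx(x):
        z = (2 * x - a - b) / (b - a)            # map x into [-1, 1]
        z = max(-1.0, min(1.0, z))               # clamp against roundoff
        return sum(c * math.cos(j * math.acos(z)) for j, c in enumerate(coeffs))
    return approx
```

For functions that are smooth on the interval, the maximum interpolation error falls roughly geometrically in the number of nodes.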
In this paper, we provide a brief introduction to a new macroeconometric model of the Spanish economy named MEDEA (Modelo de Equilibrio Dinámicode la Economía EspañolA). MEDEA is a dynamic stochastic general equilibrium (DSGE) model that aims to describe the main features of the Spanish economy for policy analysis, counterfactual exercises, and forecasting. MEDEA is built in the tradition of New Keynesian models with real and nominal rigidities, but it also incorporates aspects such as a small open economy framework, an outside monetary authority such as the ECB, and population growth, factors that are important in accounting for aggregate fluctuations in Spain. The model is estimated with Bayesian techniques and data from the last two decades. Beyond describing the properties of the model, we perform different exercises to illustrate the potential of MEDEA, including historical decompositions, long-run and short-run simulations, and counterfactual experiments. Download Paper
This paper shows how changes in the volatility of the real interest rate at which small open emerging economies borrow have a quantitatively important effect on real variables like output, consumption, investment, and hours worked. To motivate our investigation, we document the strong evidence of time-varying volatility in the real interest rates faced by a sample of four emerging small open economies: Argentina, Ecuador, Venezuela, and Brazil. We postulate a stochastic volatility process for real interest rates using T-bill rates and country spreads and estimate it with the help of the particle filter and Bayesian methods. Then, we feed the estimated stochastic volatility process for real interest rates into an otherwise standard small open economy business cycle model. We calibrate eight versions of our model to match basic aggregate observations, two versions for each of the four countries in our sample. We find that an increase in real interest rate volatility triggers a fall in output, consumption, investment, and hours worked, and a notable change in the current account of the economy. Download Paper
The dynamics of a linear (or linearized) dynamic stochastic economic model can be expressed in terms of matrices (A, B, C, D) that define a state space system. An associated state space system (A, K, C, Σ) determines a vector autoregression for observables available to an econometrician. We review circumstances under which the impulse response of the VAR resembles the impulse response associated with the economic model. We give four examples that illustrate a simple condition for checking whether the mapping from VAR shocks to economic shocks is invertible. The condition applies when there are equal numbers of VAR and economic shocks. Download Paper
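The condition referred to above can be sketched for a small system: with equal numbers of shocks and observables and D square and invertible, the VAR innovations recover the economic shocks when every eigenvalue of A − B D⁻¹ C lies strictly inside the unit circle. Below is a toy case with two states, one shock, and one observable; the matrices are illustrative, not one of the paper's examples.

```python
import cmath

def eig2(M):
    """Eigenvalues of a 2x2 matrix via the characteristic quadratic."""
    tr = M[0][0] + M[1][1]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    disc = cmath.sqrt(tr * tr - 4 * det)
    return [(tr + disc) / 2, (tr - disc) / 2]

def var_recovers_shocks(A, B, C, D):
    """A: 2x2 state transition, B: 2x1 shock loading, C: 1x2 observation row,
    D: nonzero scalar impact of the shock on the observable.
    True iff all eigenvalues of A - B D^{-1} C are inside the unit circle."""
    F = [[A[i][j] - B[i] * C[j] / D for j in range(2)] for i in range(2)]
    return all(abs(lam) < 1 for lam in eig2(F))
```

Shrinking D (the contemporaneous impact of the shock on the observable) can push an eigenvalue of A − B D⁻¹ C outside the unit circle, breaking invertibility.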
This paper studies the econometrics of computed dynamic models. Since these models generally lack a closed-form solution, economists approximate the policy functions of the agents in the model with numerical methods. But this implies that, instead of the exact likelihood function, the researcher can evaluate only an approximated likelihood associated with the approximated policy function. What are the consequences for inference of the use of approximated likelihoods? First, we show that as the approximated policy function converges to the exact policy, the approximated likelihood also converges to the exact likelihood. Second, we prove that the approximated likelihood converges at the same rate as the approximated policy function. Third, we find that the error in the approximated likelihood gets compounded with the size of the sample. Fourth, we discuss convergence of Bayesian and classical estimates. We complete the paper with three applications to document the quantitative importance of our results. Download Paper
This paper compares two methods for undertaking likelihood-based inference in dynamic equilibrium economies: a Sequential Monte Carlo filter proposed by Fernández-Villaverde and Rubio-Ramírez (2004) and the Kalman filter. The Sequential Monte Carlo filter exploits the nonlinear structure of the economy and evaluates the likelihood function of the model by simulation methods. The Kalman filter estimates a linearization of the economy around the steady state. We report two main results. First, both for simulated and for real data, the Sequential Monte Carlo filter delivers a substantially better fit of the model to the data as measured by the marginal likelihood. This is true even for a nearly linear case. Second, the differences in terms of point estimates, even if relatively small in absolute values, have important effects on the moments of the model. We conclude that the nonlinear filter is a superior procedure for taking models to the data. Download Paper
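The linear benchmark in this comparison is the Kalman filter, which delivers the exact likelihood of a linearized model with Gaussian shocks. A scalar sketch follows (an illustrative state-space model, not the linearized economy estimated in the paper):

```python
import math

# Exact log likelihood of x_t = rho x_{t-1} + sigma eps_t, y_t = x_t + sigma_y nu_t.
def kalman_loglik(ys, rho, sigma, sigma_y):
    x, P = 0.0, sigma ** 2 / (1 - rho ** 2)    # stationary prior for the state
    loglik = 0.0
    for y in ys:
        # Predict one step ahead.
        x, P = rho * x, rho ** 2 * P + sigma ** 2
        # Update: S is the one-step-ahead forecast-error variance.
        S = P + sigma_y ** 2
        loglik += -0.5 * (math.log(2 * math.pi * S) + (y - x) ** 2 / S)
        K = P / S                               # Kalman gain
        x, P = x + K * (y - x), (1 - K) * P
    return loglik
```

The Sequential Monte Carlo filter replaces the Gaussian predict and update steps with simulated particles, which is what lets it work directly with the nonlinear model.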
This paper compares solution methods for dynamic equilibrium economies. We compute and simulate the stochastic neoclassical growth model with leisure choice using Undetermined Coefficients in levels and in logs, Finite Elements, Chebyshev Polynomials, Second and Fifth Order Perturbations and Value Function Iteration for several calibrations. We document the performance of the methods in terms of computing time, implementation complexity and accuracy and we present some conclusions about our preferred approaches based on the reported evidence. Download Paper
This paper presents some new results on the solution of the stochastic neoclassical growth model with leisure. We use the method of Judd (2003) to explore how to change variables in the computed policy functions that characterize the behavior of the economy. We find a simple closed-form relation between the parameters of the linear and the loglinear solution of the model. We extend this approach to a general class of changes of variables and show how to find the optimal transformation. We report how in this way we reduce the average absolute Euler equation errors of the solution of the model by a factor of three. We also demonstrate how changes of variables correct for variations in the volatility of the economy even if we work with first order policy functions and how we can keep a linear representation of the laws of motion of the model if we use a nearly optimal transformation. We finish by discussing how to apply our results to estimate dynamic equilibrium economies. Download Paper
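The flavor of a change of variables can be seen in a special case that is not the paper's model: the growth model with log utility and full depreciation, whose exact policy k' = αβk^α is linear in logs but not in levels. Here the first-order coefficient in logs equals the level coefficient scaled by k*/k'* (which is 1 at the steady state, since k'* = k*), so re-expressing the same first-order solution in logs recovers the exact policy. This is an illustrative sketch, not the paper's derivation.

```python
import math

alpha, beta = 0.3, 0.95
kstar = (alpha * beta) ** (1 / (1 - alpha))        # steady-state capital

def policy_exact(k):
    return alpha * beta * k ** alpha               # known closed-form policy

# First-order approximation in levels around k*.
b_lin = alpha * beta * alpha * kstar ** (alpha - 1)   # dk'/dk at k* (= alpha here)
def policy_linear(k):
    return kstar + b_lin * (k - kstar)

# The same coefficient applied in log deviations is exact, because
# ln k' = ln(alpha * beta) + alpha * ln k is linear in ln k.
def policy_loglinear(k):
    return kstar * math.exp(b_lin * (math.log(k) - math.log(kstar)))
```

Away from the steady state, the level approximation drifts from the true policy while the log version does not, which is the kind of accuracy gain a good change of variables buys.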
This paper presents a framework to undertake likelihood-based inference in nonlinear dynamic equilibrium economies. We develop a Sequential Monte Carlo algorithm that delivers an estimate of the likelihood function of the model using simulation methods. This likelihood can be used for parameter estimation and for model comparison. The algorithm can deal both with nonlinearities of the economy and with the presence of non-normal shocks. We show consistency of the estimate and its good performance in finite samples. This algorithm is important because the existing empirical literature following a likelihood approach had been limited to the estimation of linear models with Gaussian innovations. We apply our procedure to estimate the structural parameters of the neoclassical growth model. Download Paper