We argue that political distribution risk is an important driver of aggregate fluctuations. To that end, we document significant changes in the capital share after large political events, such as political realignments, modifications in collective bargaining rules, or the end of dictatorships, in a sample of developed and emerging economies. These policy changes are associated with significant fluctuations in output and asset prices. Using a Bayesian proxy-VAR estimated with U.S. data, we show how distribution shocks cause movements in output, unemployment, and sectoral asset prices. To quantify the importance of these political shocks for the U.S. as a whole, we extend an otherwise standard neoclassical growth model. We model political shocks as exogenous changes in the bargaining power of workers in a labor market with search and matching. We calibrate the model to the U.S. corporate non-financial business sector and we back out the evolution of the bargaining power of workers over time using a new methodological approach, the partial filter. We show how the estimated shocks agree with the historical narrative evidence. We document that bargaining shocks account for 34% of aggregate fluctuations.
This paper argues that institutions and political party systems are simultaneously determined. A large change to the institutional framework, such as the creation of the euro by a group of European countries, will realign, after a transition period, the party system as well. The new political landscape may not be compatible with the institutions that triggered it. To illustrate this point, we study the case of the euro and how the party system has evolved in Southern and Northern European countries in response to it.
A safe asset’s real value is insulated from shocks, including declines in GDP from rare macroeconomic disasters. However, in a Lucas-tree world, the aggregate risk is given by the process for GDP and cannot be altered by the creation of safe assets. Therefore, in the equilibrium of a representative-agent version of this economy, the quantity of safe assets will be nil. With heterogeneity in coefficients of relative risk aversion, safe assets can take the form of private bond issues from low-risk-aversion to high-risk-aversion agents. The model assumes Epstein-Zin/Weil preferences with common values of the intertemporal elasticity of substitution and the rate of time preference. The model achieves stationarity by allowing for random shifts in coefficients of relative risk aversion. We derive the equilibrium values of the ratio of safe to total assets, the shares of each agent in equity ownership and wealth, and the rates of return on safe and risky assets. In a baseline case, the steady-state risk-free rate is 1.0% per year, the unlevered equity premium is 4.2%, and the quantity of safe assets ranges up to 15% of economy-wide assets (comprising the capitalized value of GDP). A disaster shock leads to an extended period in which the share of wealth held by the low-risk-averse agent and the risk-free rate are low but rising, and the ratio of safe to total assets is high but falling. In the baseline model, Ricardian Equivalence holds in that added government bonds have no effect on rates of return and the net quantity of safe assets. Surprisingly, the crowding-out coefficient for private bonds with respect to public bonds is not 0 or -1 but around -0.5, a value found in some existing empirical studies.
What do we know about the economic consequences of labor market regulations? Few economic policy questions are as contentious as labor market regulations. The effects of minimum wages, collective bargaining provisions, and hiring/firing restrictions generate heated debates in the U.S. and other advanced economies. And yet, establishing empirical lessons about the consequences of these regulations is surprisingly difficult. In this paper, I explain some of the reasons why this is the case, and I critically review the recent findings regarding the effects of minimum wages on employment. Contrary to frequently asserted claims, the preponderance of the evidence still points toward a negative impact of permanently high minimum wages.
Can competition among privately issued fiat currencies such as Bitcoin or Ethereum work? Only sometimes. To show this, we build a model of competition among privately issued fiat currencies. We modify the current workhorse of monetary economics, the Lagos-Wright environment, by including entrepreneurs who can issue their own fiat currencies in order to maximize their utility. Otherwise, the model is standard. We show that there exists an equilibrium in which price stability is consistent with competing private monies, but also that there exists a continuum of equilibrium trajectories with the property that the value of private currencies monotonically converges to zero. These latter equilibria disappear, however, when we introduce productive capital. We also investigate the properties of hybrid monetary arrangements with private and government monies, of automata issuing money, and the role of network effects.
This chapter provides an overview of solution and estimation techniques for dynamic stochastic general equilibrium (DSGE) models. We cover the foundations of numerical approximation techniques as well as statistical inference and survey the latest developments in the field.
This paper surveys the legal tradition that links Magna Carta with the modern concepts of the rule of law and the limits on government. It documents that the original understanding of the rule of law included substantive commitments to individual freedom and limited government. Then, it attempts to explain how and why such commitments were lost to a formalist interpretation of the rule of law from 1848 to 1939. The paper concludes by arguing that a revival of the substantive commitments of the rule of law is central to a project of reshaping modern states.
We propose a novel method to estimate dynamic equilibrium models with stochastic volatility. First, we characterize the properties of the solution to this class of models. Second, we take advantage of the results about the structure of the solution to build a sequential Monte Carlo algorithm to evaluate the likelihood function of the model. The approach, which exploits the profusion of shocks in stochastic volatility models, is versatile and computationally tractable even in large-scale models, such as those often employed by policy-making institutions. As an application, we use our algorithm and Bayesian methods to estimate a business cycle model of the U.S. economy with both stochastic volatility and parameter drifting in monetary policy. Our application shows the importance of stochastic volatility in accounting for the dynamics of the data.
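To make the likelihood-evaluation idea concrete, below is a minimal bootstrap particle filter for a toy stochastic volatility model, sketched in Python/NumPy. It is not the paper's algorithm: the model, parameter values, and function names are illustrative assumptions, and a production version would guard against weight underflow.

```python
import numpy as np

def particle_filter_loglik(y, rho=0.95, sigma_eta=0.2, beta=0.5, N=5000, seed=0):
    """Bootstrap particle filter log-likelihood for a toy SV model:
    h_t = rho * h_{t-1} + sigma_eta * eta_t,  y_t = beta * exp(h_t / 2) * eps_t,
    with eta_t, eps_t ~ N(0, 1). (Illustrative sketch, not the paper's algorithm.)
    """
    rng = np.random.default_rng(seed)
    # Draw initial particles from the stationary distribution of h_t.
    h = rng.normal(0.0, sigma_eta / np.sqrt(1.0 - rho**2), N)
    loglik = 0.0
    for obs in y:
        # Propagate each particle through the state equation.
        h = rho * h + sigma_eta * rng.standard_normal(N)
        # Weight by the measurement density y_t | h_t ~ N(0, beta^2 * exp(h_t)).
        sd = beta * np.exp(h / 2.0)
        w = np.exp(-0.5 * (obs / sd) ** 2) / (np.sqrt(2.0 * np.pi) * sd)
        loglik += np.log(w.mean())
        # Resample particles in proportion to their weights.
        h = rng.choice(h, size=N, p=w / w.sum())
    return loglik
```

Because the likelihood estimate is simulated, it is noisy; in practice one fixes the random seed across parameter values or increases N when embedding the filter in a Bayesian sampler.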
Chamley (1986) and Judd (1985) showed that, in a standard neoclassical growth model with capital accumulation and infinitely lived agents, either taxing or subsidizing capital cannot be optimal in the steady state. In this paper, we introduce innovation-led growth into the Chamley-Judd framework, using a Schumpeterian growth model where productivity-enhancing innovations result from profit-motivated R&D investment. Our main result is that, for a given required trend of public expenditure, a zero tax/subsidy on capital becomes suboptimal. In particular, the higher the level of public expenditure and the income elasticity of labor supply, the less should capital income be subsidized and the more it should be taxed. Not taxing capital implies that labor must be taxed at a higher rate. This in turn has a detrimental effect on labor supply and therefore on the market size for innovation. At the same time, for a given labor supply, taxing capital also reduces innovation incentives, so that for low levels of public expenditure and/or labor supply elasticity it becomes optimal to subsidize capital income.
We study the effects of changes in uncertainty about future fiscal policy on aggregate economic activity. Fiscal deficits and public debt have risen sharply in the wake of the financial crisis. While these developments make fiscal consolidation inevitable, there is considerable uncertainty about the policy mix and timing of such budgetary adjustment. To evaluate the consequences of this increased uncertainty, we first estimate tax and spending processes for the U.S. that allow for time-varying volatility. We then feed these processes into an otherwise standard New Keynesian business cycle model calibrated to the U.S. economy. We find that fiscal volatility shocks have an adverse effect on economic activity that is comparable to the effects of a 25-basis-point innovation in the federal funds rate.
In this paper we report the results of the estimation of a rich dynamic stochastic general equilibrium (DSGE) model of the U.S. economy with both stochastic volatility and parameter drifting in the Taylor rule. We use the results of this estimation to examine the recent monetary history of the U.S. and to interpret, through this lens, the sources of the rise and fall of the great American inflation from the late 1960s to the early 1980s and of the great moderation of business cycle fluctuations between 1984 and 2007. Our main findings are that while there is strong evidence of changes in monetary policy during Volcker’s tenure at the Fed, those changes contributed little to the great moderation. Instead, changes in the volatility of structural shocks account for most of it. Also, while we find that monetary policy was different under Volcker, we do not find much evidence of a big difference in monetary policy among Burns, Miller, and Greenspan. The difference in aggregate outcomes across these periods is attributed to the time-varying volatility of shocks. The history for inflation is more nuanced, as a more vigorous stand against it would have reduced inflation in the 1970s, but not completely eliminated it. In addition, we find that volatile shocks (especially those related to aggregate demand) were important contributors to the great American inflation.
This paper compares the role of stochastic volatility versus changes in monetary policy rules in accounting for the time-varying volatility of U.S. aggregate data. Of special interest to us is understanding the sources of the great moderation of business cycle fluctuations that the U.S. economy experienced between 1984 and 2007. To explore this issue, we build a medium-scale dynamic stochastic general equilibrium (DSGE) model with both stochastic volatility and parameter drifting in the Taylor rule and we estimate it non-linearly using U.S. data and Bayesian methods. Methodologically, we show how to confront such a rich model with the data by exploiting the structure of the high-order approximation to the decision rules that characterize the equilibrium of the economy. Our main empirical findings are: 1) even after controlling for stochastic volatility (and there is a fair amount of it), there is overwhelming evidence of changes in monetary policy during the analyzed period; 2) however, these changes in monetary policy mattered little for the great moderation; 3) most of the great performance of the U.S. economy during the 1990s was a result of good shocks; and 4) the response of monetary policy to inflation under Burns, Miller, and Greenspan was similar, while it was much higher under Volcker.
This paper shows how to build algorithms that use graphics processing units (GPUs) installed in most modern computers to solve dynamic equilibrium models in economics. In particular, we rely on the compute unified device architecture (CUDA) of NVIDIA GPUs. We illustrate the power of the approach by solving a simple real business cycle model with value function iteration. We document improvements in speed of around 200 times and suggest that even further gains are likely.
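As a rough illustration of the workload being parallelized, here is value function iteration for a deterministic growth model in vectorized NumPy; on a GPU, the maximization at each grid point would map to its own thread. This is only a sketch under assumed parameter values, not the paper's CUDA implementation.

```python
import numpy as np

# Deterministic neoclassical growth model with log utility and full depreciation.
alpha, beta = 0.33, 0.96
k_grid = np.linspace(0.05, 0.5, 1000)

# Utility of every (k, k') pair; infeasible choices get -inf.
c = k_grid[:, None] ** alpha - k_grid[None, :]
u = np.where(c > 0.0, np.log(np.maximum(c, 1e-12)), -np.inf)

V = np.zeros(len(k_grid))
for _ in range(2000):
    # The max over k' at each k is embarrassingly parallel: one GPU thread per k.
    V_new = np.max(u + beta * V[None, :], axis=1)
    done = np.max(np.abs(V_new - V)) < 1e-8
    V = V_new
    if done:
        break
policy = k_grid[np.argmax(u + beta * V[None, :], axis=1)]
```

The speedups the paper reports come from running that inner maximization for thousands of grid points simultaneously, which is exactly the pattern GPUs are built for.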
We solve a dynamic stochastic general equilibrium (DSGE) model in which the representative household has Epstein and Zin recursive preferences. The parameters governing preferences and technology are estimated by means of maximum likelihood using macroeconomic data and asset prices, with a particular focus on the term structure of interest rates. We estimate a large risk aversion, an elasticity of intertemporal substitution higher than one, and substantial adjustment costs. Furthermore, we identify the tensions within the model by estimating it on subsets of these data. We conclude by pointing out potential extensions that might improve the model’s fit.
This paper compares different solution methods for computing the equilibrium of dynamic stochastic general equilibrium (DSGE) models with recursive preferences such as those in Epstein and Zin (1989 and 1991). Models with these preferences have recently become popular, but we know little about the best ways to implement them numerically. To fill this gap, we solve the stochastic neoclassical growth model with recursive preferences using four different approaches: second- and third-order perturbation, Chebyshev polynomials, and value function iteration. We document the performance of the methods in terms of computing time, implementation complexity, and accuracy. Our main finding is that a third-order perturbation is competitive in terms of accuracy with Chebyshev polynomials and value function iteration, while being an order of magnitude faster to run. Therefore, we conclude that perturbation methods are an attractive approach for computing this class of problems.
In this paper, we provide a brief introduction to a new macroeconometric model of the Spanish economy named MEDEA (Modelo de Equilibrio Dinámico de la Economía EspañolA). MEDEA is a dynamic stochastic general equilibrium (DSGE) model that aims to describe the main features of the Spanish economy for policy analysis, counterfactual exercises, and forecasting. MEDEA is built in the tradition of New Keynesian models with real and nominal rigidities, but it also incorporates aspects such as a small open economy framework, an outside monetary authority such as the ECB, and population growth, factors that are important in accounting for aggregate fluctuations in Spain. The model is estimated with Bayesian techniques and data from the last two decades. Beyond describing the properties of the model, we perform different exercises to illustrate the potential of MEDEA, including historical decompositions, long-run and short-run simulations, and counterfactual experiments.
This paper shows how changes in the volatility of the real interest rate at which small open emerging economies borrow have a quantitatively important effect on real variables like output, consumption, investment, and hours worked. To motivate our investigation, we document the strong evidence of time-varying volatility in the real interest rates faced by a sample of four emerging small open economies: Argentina, Ecuador, Venezuela, and Brazil. We postulate a stochastic volatility process for real interest rates using T-bill rates and country spreads and estimate it with the help of the particle filter and Bayesian methods. Then, we feed the estimated stochastic volatility process for real interest rates into an otherwise standard small open economy business cycle model. We calibrate eight versions of our model to match basic aggregate observations, two versions for each of the four countries in our sample. We find that an increase in real interest rate volatility triggers a fall in output, consumption, investment, and hours worked, and a notable change in the current account of the economy.
In this paper, I review the literature on the formulation and estimation of dynamic stochastic general equilibrium (DSGE) models with a special emphasis on Bayesian methods. First, I discuss the evolution of DSGE models over the last couple of decades. Second, I explain why the profession has decided to estimate these models using Bayesian methods. Third, I briefly introduce some of the techniques required to compute and estimate these models. Fourth, I illustrate the techniques under consideration by estimating a benchmark DSGE model with real and nominal rigidities. I conclude by offering some pointers for future research.
The dynamics of a linear (or linearized) dynamic stochastic economic model can be expressed in terms of matrices (A, B, C, D) that define a state space system. An associated state space system (A, K, C, Σ) determines a vector autoregression for observables available to an econometrician. We review circumstances under which the impulse response of the VAR resembles the impulse response associated with the economic model. We give four examples that illustrate a simple condition for checking whether the mapping from VAR shocks to economic shocks is invertible. The condition applies when there are equal numbers of VAR and economic shocks.
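The condition admits a one-line numerical check: with equal numbers of VAR and economic shocks and an invertible D, invertibility of the shock mapping amounts to the eigenvalues of A - B D^{-1} C lying strictly inside the unit circle. Below is a minimal sketch; the matrices in the example are illustrative placeholders, not taken from the paper.

```python
import numpy as np

def invertible_mapping(A, B, C, D):
    """True when the eigenvalues of A - B D^{-1} C are strictly inside the
    unit circle, so VAR innovations recover the economic shocks."""
    M = A - B @ np.linalg.solve(D, C)
    eig = np.linalg.eigvals(M)
    return bool(np.all(np.abs(eig) < 1.0)), eig

# Illustrative scalar example: one state, one shock, one observable.
ok, eig = invertible_mapping(np.array([[0.9]]), np.array([[1.0]]),
                             np.array([[0.5]]), np.array([[1.0]]))
print(ok, eig)  # True, eigenvalue 0.4
```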
This paper studies the econometrics of computed dynamic models. Since these models generally lack a closed-form solution, economists approximate the policy functions of the agents in the model with numerical methods. But this implies that, instead of the exact likelihood function, the researcher can evaluate only an approximated likelihood associated with the approximated policy function. What are the consequences for inference of the use of approximated likelihoods? First, we show that as the approximated policy function converges to the exact policy, the approximated likelihood also converges to the exact likelihood. Second, we prove that the approximated likelihood converges at the same rate as the approximated policy function. Third, we find that the error in the approximated likelihood gets compounded with the size of the sample. Fourth, we discuss convergence of Bayesian and classical estimates. We complete the paper with three applications to document the quantitative importance of our results.
Job security provisions are commonly invoked to explain the high and persistent European unemployment rates. This belief has led several countries to reform their labor markets and liberalize the use of fixed-term contracts. Despite how common such contracts have become after deregulation, there is a lack of quantitative analysis of their impact on the economy. To fill this gap, we build a general equilibrium model with heterogeneous agents and firing costs in the tradition of Hopenhayn and Rogerson (1993). We calibrate our model to Spanish data, choosing in part parameters estimated with firm-level longitudinal data. Spain is particularly interesting, since its labor regulations are among the most protective in the OECD, and both its unemployment and its share of fixed-term employment are the highest. We find that fixed-term contracts increase unemployment, reduce output, and raise productivity. The welfare effects are ambiguous.
This paper compares two methods for undertaking likelihood-based inference in dynamic equilibrium economies: a Sequential Monte Carlo filter proposed by Fernández-Villaverde and Rubio-Ramírez (2004) and the Kalman filter. The Sequential Monte Carlo filter exploits the nonlinear structure of the economy and evaluates the likelihood function of the model by simulation methods. The Kalman filter estimates a linearization of the economy around the steady state. We report two main results. First, both for simulated and for real data, the Sequential Monte Carlo filter delivers a substantially better fit of the model to the data as measured by the marginal likelihood. This is true even for a nearly linear case. Second, the differences in terms of point estimates, even if relatively small in absolute values, have important effects on the moments of the model. We conclude that the nonlinear filter is a superior procedure for taking models to the data.
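For reference, the linear benchmark in this comparison is the exact Gaussian log-likelihood that the Kalman filter delivers for a linearized state space x_t = A x_{t-1} + B w_t, y_t = C x_t + v_t. The sketch below is the textbook recursion, not the paper's code; the noise structure and initialization are assumptions (stable A, w ~ N(0, I), v ~ N(0, R)).

```python
import numpy as np

def kalman_loglik(y, A, B, C, R):
    """Gaussian log-likelihood of y_{1:T} for x' = A x + B w, y = C x + v,
    with w ~ N(0, I), v ~ N(0, R), and A assumed stable."""
    n = A.shape[0]
    Q = B @ B.T
    x = np.zeros(n)
    P = np.eye(n)
    for _ in range(1000):          # iterate to the stationary state covariance
        P = A @ P @ A.T + Q
    loglik = 0.0
    for obs in y:
        x = A @ x                  # prediction step
        P = A @ P @ A.T + Q
        e = obs - C @ x            # innovation
        S = C @ P @ C.T + R        # innovation covariance
        loglik += -0.5 * (len(obs) * np.log(2.0 * np.pi)
                          + np.linalg.slogdet(S)[1]
                          + e @ np.linalg.solve(S, e))
        K = P @ C.T @ np.linalg.inv(S)   # update step
        x = x + K @ e
        P = (np.eye(n) - K @ C) @ P
    return loglik
```

The Sequential Monte Carlo filter replaces these Gaussian prediction and update steps with simulated particles, which is what lets it handle the model's nonlinearities.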
This paper compares solution methods for dynamic equilibrium economies. We compute and simulate the stochastic neoclassical growth model with leisure choice using Undetermined Coefficients in levels and in logs, Finite Elements, Chebyshev Polynomials, Second- and Fifth-Order Perturbations, and Value Function Iteration for several calibrations. We document the performance of the methods in terms of computing time, implementation complexity, and accuracy, and we present some conclusions about our preferred approaches based on the reported evidence.
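The accuracy dimension of such comparisons is typically measured with Judd-style Euler equation errors, expressed in consumption units. Here is a minimal sketch for a case with a known closed-form solution (log utility, full depreciation), where the exact policy k' = alpha*beta*k^alpha should yield errors near machine precision; parameter values are illustrative.

```python
import numpy as np

alpha, beta = 0.33, 0.96
k_grid = np.linspace(0.05, 0.5, 200)

def euler_errors(policy, k):
    """Euler equation errors in consumption units for u(c) = log(c),
    c = k^alpha - policy(k), full depreciation."""
    k1 = policy(k)                      # capital chosen today
    c = k ** alpha - k1                 # consumption today
    c1 = k1 ** alpha - policy(k1)       # consumption tomorrow
    # Consumption implied by the Euler equation 1/c = beta * alpha * k1^(alpha-1) / c1.
    c_implied = c1 / (beta * alpha * k1 ** (alpha - 1.0))
    return np.abs(1.0 - c_implied / c)

exact = lambda k: alpha * beta * k ** alpha    # closed-form policy
print(np.log10(euler_errors(exact, k_grid).max()))  # roughly -15: machine precision
```

Plugging an approximate policy into euler_errors in place of exact quantifies its accuracy on the same scale, which is how the methods above can be ranked.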
This paper presents some new results on the solution of the stochastic neoclassical growth model with leisure. We use the method of Judd (2003) to explore how to change variables in the computed policy functions that characterize the behavior of the economy. We find a simple closed-form relation between the parameters of the linear and the loglinear solution of the model. We extend this approach to a general class of changes of variables and show how to find the optimal transformation. We report that, in this way, we reduce the average absolute Euler equation errors of the solution of the model by a factor of three. We also demonstrate how changes of variables correct for variations in the volatility of the economy even if we work with first-order policy functions and how we can keep a linear representation of the laws of motion of the model if we use a nearly optimal transformation. We finish by discussing how to apply our results to estimate dynamic equilibrium economies.
This paper presents a framework to undertake likelihood-based inference in nonlinear dynamic equilibrium economies. We develop a Sequential Monte Carlo algorithm that delivers an estimate of the likelihood function of the model using simulation methods. This likelihood can be used for parameter estimation and for model comparison. The algorithm can deal both with nonlinearities of the economy and with the presence of non-normal shocks. We show consistency of the estimate and its good performance in finite simulations. This new algorithm is important because the existing empirical literature that wanted to follow a likelihood approach was limited to the estimation of linear models with Gaussian innovations. We apply our procedure to estimate the structural parameters of the neoclassical growth model.
This paper uses a seminonparametric model and Consumer Expenditure Survey data to estimate life cycle profiles of consumption, controlling for demographics, cohort and time effects. In addition to documenting profiles for total and nondurable consumption, we devote special attention to the age expenditure pattern for consumer durables. We find hump-shaped paths over the life cycle for total, for nondurable and for durable expenditures. Changes in household size account for roughly half of these humps. The other half remains unaccounted for by the standard complete markets life cycle model. Our results imply that households do not smooth consumption over their lifetimes. This is especially true for services from consumer durables. Bootstrap simulations suggest that our empirical estimates are tight and sensitivity analysis indicates that the computed profiles are robust to a large number of different specifications.
This paper proposes a new, more robust, experiment to test for the presence of hyperbolic discounting. Recently, a growing literature has studied intertemporal choice when individuals discount the future hyperbolically. These preferences generate dynamically inconsistent choices, in contrast with the usual assumption of exponential discounting, where this issue cannot arise. Hyperbolic discounting is justified based on experimental evidence of individual self-control problems. We argue that this interpretation depends crucially on the absence of uncertainty. We show that, once uncertainty is included, the observed behavior is compatible with exponential discounting. We then test for the presence of hyperbolic discounting in a new experiment that controls for uncertainty. The experiment offers two choice sets, the second being a strict subset of the first. Exponential discounters will (possibly weakly) prefer the largest one. Hyperbolic discounters, in contrast, will (strictly) prefer the second set because its design makes it equivalent to a commitment technology. The experiment is conducted on a sample of undergraduate students. Our results suggest that hyperbolic behavior is more difficult to find than implied by previous experiments.
This paper studies the relationship between population dynamics and economic growth. Prior to the Industrial Revolution, increases in total output were roughly matched by increases in population. In contrast, during the last 150 years, increments in per capita income have coexisted with slow population growth. Why are income and population growth no longer positively correlated? This paper presents a new answer, based on the role of capital-specific technological change, that provides a unifying account of lower population growth and sustained economic growth. An overlapping generations model with capital-skill complementarity and endogenous fertility, mortality, and education is constructed and parameterized to match English data from 1536 to 1920. The key finding is that the observed fall in the relative price of capital can account for more than 60% of the fall in fertility and over 50% of the increase in income per capita in England that occurred during the demographic transition. Additional experiments show that neutral technological change or the reduction in mortality cannot account for the fall in fertility.
This paper studies the properties of the Bayesian approach to estimation and comparison of dynamic equilibrium economies. Both tasks can be performed even if the models are nonnested, misspecified and nonlinear. First, we show that Bayesian methods have a classical interpretation: asymptotically the parameter point estimates converge to their pseudotrue values and the best model under the Kullback-Leibler distance will have the highest posterior probability. Second, we illustrate the strong small sample behavior of the approach using a well-known application: the U.S. cattle cycle. Bayesian estimates outperform Maximum Likelihood results and the proposed model is easily compared with a set of BVARs.
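The model-comparison step reduces to converting (log) marginal likelihoods into posterior model probabilities. Here is a minimal sketch, assuming equal prior model probabilities and using made-up numbers:

```python
import numpy as np

def posterior_model_probs(log_marginal_liks, log_priors=None):
    """Posterior probabilities p(M_i | y) from log marginal likelihoods,
    stabilized with a log-sum-exp shift to avoid underflow."""
    lml = np.asarray(log_marginal_liks, dtype=float)
    if log_priors is None:
        log_priors = np.zeros_like(lml)     # equal prior model probabilities
    log_post = lml + log_priors
    log_post -= log_post.max()              # shift before exponentiating
    p = np.exp(log_post)
    return p / p.sum()

# Illustrative: a structural model against two BVARs (numbers are made up).
print(posterior_model_probs([-1050.3, -1052.1, -1061.7]))
```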