Working Papers
Paper #  Author  Title  

17005 
Hanming Fang Zenan Wu 
Life Insurance and Life Settlement Markets with Overconfident Policyholders  
We analyze how the life settlement market (the secondary market for life insurance) may affect consumer welfare in a dynamic equilibrium model of life insurance with one-sided commitment and overconfident policyholders. As in Daily et al. (2008) and Fang and Kung (2010), policyholders may lapse their life insurance policies when they lose their bequest motives; but in our model the policyholders may underestimate their probability of losing their bequest motive, or be overconfident about their future mortality risks. For the case of overconfidence with respect to bequest motives, we show that in the absence of life settlement overconfident consumers may buy "too much" reclassification risk insurance for later periods in the competitive equilibrium. In contrast, when consumers are overconfident about their future mortality rates in the sense that they put too high a subjective probability on the low-mortality state, the competitive equilibrium contract in the absence of life settlement exploits the consumer bias by offering them very high face amounts only in the low-mortality state. In both cases, the life settlement market can impose a discipline on the extent to which overconfident consumers can be exploited by the primary insurers. We show that life settlement may increase the equilibrium consumer welfare of overconfident consumers when they are sufficiently vulnerable in the sense that they have a sufficiently large intertemporal elasticity of substitution of consumption. Download Paper


17004 
Audrey Hu Steven A. Matthews Liang Zou 
Low Reserve Prices in Auctions  
Received auction theory prescribes that a reserve price that maximizes expected profit should be no less than the seller's own value for the auctioned object. In contrast, a common empirical observation is that many auctions have reserve prices set below sellers' values, even at zero. This paper revisits the theory to find a potential resolution of the puzzle for second-price auctions. The main result is that an optimal reserve price may be less than the seller's value if bidders are risk averse and have interdependent values. Moreover, the resulting outcome may be arbitrarily close to that of an auction that has no reserve price, an absolute auction. Download Paper


17003 
Francis X. Diebold Laura Liu Kamil Yilmaz 
Commodity Connectedness  
We use variance decompositions from high-dimensional vector autoregressions to characterize connectedness in 19 key commodity return volatilities, 2011-2016. We study both static (full-sample) and dynamic (rolling-sample) connectedness. We summarize and visualize the results using tools from network analysis. The results reveal clear clustering of commodities into groups that match traditional industry groupings, but with some notable differences. The energy sector is most important in terms of sending shocks to others, and energy, industrial metals, and precious metals are themselves tightly connected. Download Paper


17002 
Julio A. Carrillo Enrique G. Mendoza Victoria Nuguer Jessica Roldan-Pena 
Tight Money-Tight Credit: Coordination Failure in the Conduct of Monetary and Financial Policies  
Quantitative analysis of a New Keynesian model with the Bernanke-Gertler accelerator and risk shocks shows that violations of Tinbergen's Rule and strategic interaction between policymaking authorities significantly undermine the effectiveness of monetary and financial policies. Separate monetary and financial policy rules, with the latter subsidizing lenders to encourage lending when credit spreads rise, produce higher welfare and smoother business cycles than a monetary rule augmented with credit spreads. The latter yields a tight money-tight credit regime in which the interest rate responds too much to inflation and not enough to adverse credit conditions. Reaction curves for the choice of policy-rule elasticity that minimizes each authority's loss function given the other authority's elasticity are nonlinear, reflecting shifts from strategic substitutes to complements in setting policy-rule parameters. The Nash equilibrium is significantly inferior to the Cooperative equilibrium, both are inferior to a first-best outcome that maximizes welfare, and both produce tight money-tight credit regimes. Download Paper


17001 
Yunan Li 
Mechanism Design with Financially Constrained Agents and Costly Verification  
A principal wishes to distribute an indivisible good to a population of budget-constrained agents. Both valuation and budget are an agent's private information. The principal can inspect an agent's budget through a costly verification process and punish an agent who makes a false statement. I characterize the direct surplus-maximizing mechanism. This direct mechanism can be implemented by a two-stage mechanism in which agents only report their budgets. Specifically, all agents report their budgets in the first stage. The principal then provides budget-dependent cash subsidies to agents and assigns the goods randomly (with uniform probability) at budget-dependent prices. In the second stage, a resale market opens, but is regulated with budget-dependent sales taxes. Agents who report low budgets receive more subsidies in their initial purchases (the first stage), face higher taxes in the resale market (the second stage), and are inspected randomly. This implementation exhibits some features of welfare programs such as Singapore's Housing and Development Board. Download Paper


16027 
Jesus Fernandez-Villaverde 
The Economic Consequences of Labor Market Regulations  
What do we know about the economic consequences of labor market regulations? Few economic policy questions are as contentious as labor market regulations. The effects of minimum wages, collective bargaining provisions, and hiring/firing restrictions generate heated debates in the U.S. and other advanced economies. And yet, establishing empirical lessons about the consequences of these regulations is surprisingly difficult. In this paper, I explain some of the reasons why this is the case, and I critically review the recent findings regarding the effects of minimum wages on employment. Contrary to often-asserted statements, the preponderance of the evidence still points toward a negative impact of permanently high minimum wages. Download Paper


16026 
Maryam Farboodi Gregor Jarosch Guido Menzio 
Intermediation as Rent Extraction  
This paper develops a theory of asset intermediation as a pure rent extraction activity. Agents meet bilaterally in a random fashion. Agents differ with respect to their valuation of the asset's dividends and with respect to their ability to commit to take-it-or-leave-it offers. In equilibrium, agents with commitment behave as intermediaries, while agents without commitment behave as end users. Agents with commitment intermediate the asset market only because they can extract more of the gains from trade when reselling or repurchasing the asset. We study the extent of intermediation as a rent extraction activity by examining the agents' decision to invest in a technology that gives them commitment. We find that multiple equilibria may emerge, with different levels of intermediation and with lower welfare in equilibria with more intermediation. We find that a decline in trading frictions leads to more intermediation and typically lower welfare, and so does a decline in the opportunity cost of acquiring commitment. A transaction tax can restore efficiency. Download Paper


16025 
Sarah Baird Aislinn Bohren Craig McIntosh Berk Ozler 
Optimal Design of Experiments in the Presence of Interference  
This paper formalizes the optimal design of randomized controlled trials (RCTs) in the presence of interference between units, where an individual's outcome depends on the behavior and outcomes of others in her group. We focus on randomized saturation (RS) designs, which are two-stage RCTs that first randomize the treatment saturation of a group, then randomize individual treatment assignment. Our main contributions are to map the potential outcomes framework with partial interference to a regression model with clustered errors, calculate the statistical power of different RS designs, and derive analytical insights for how to optimally design an RS experiment. We show that the power to detect average treatment effects declines precisely with the ability to identify novel treatment and spillover estimands, such as how effects vary with the intensity of treatment. We provide software that assists researchers in designing RS experiments. Download Paper


16024 
Aislinn Bohren 
Using Persistence to Generate Incentives in a Dynamic Moral Hazard Problem  
This paper studies how persistence can be used to create incentives in a continuous-time stochastic game in which a long-run player interacts with a sequence of short-run players. Observations of the long-run player's actions are distorted by a Brownian motion, and the actions of both players impact future payoffs through a state variable. For example, a firm or worker provides customers with a product, and the quality of this product depends on both current and past investment choices by the firm. I derive general conditions under which a Markov equilibrium emerges as the unique perfect public equilibrium, and characterize the equilibrium payoff and actions in this equilibrium, for any discount rate. I develop an application of persistent product quality to illustrate how persistence creates effective intertemporal incentives in a setting where traditional channels fail, and explore how the structure of persistence impacts equilibrium behavior. This demonstrates the power of the continuous-time setting to deliver sharp insights and a tractable equilibrium characterization for a rich class of dynamic games. Download Paper


16023 
Aislinn Bohren Troy Kravitz 
Optimal Contracting with Costly State Verification, with an Application to Crowdsourcing  
A firm employs workers to obtain costly, unverifiable information (for example, categorizing the content of images). Workers are monitored by comparing their messages. The optimal contract under limited liability exhibits three key features: (i) the monitoring technology depends crucially on the commitment power of the firm: virtual monitoring, or monitoring with arbitrarily small probability, is optimal when the firm can commit to truthfully reveal messages from other workers, while monitoring with strictly positive probability is optimal when the firm can hide messages (partial commitment); (ii) bundling multiple tasks reduces worker rents and monitoring inefficiencies; and (iii) the optimal contract is approximately efficient under full but not partial commitment. We conclude with an application to crowdsourcing platforms, and characterize the optimal contract for tasks found on these platforms. Download Paper


16022 
Laura Liu Hyungsik Roger Moon Frank Schorfheide 
Forecasting with Dynamic Panel Data Models  
This paper considers the problem of forecasting a collection of short time series using cross-sectional information in panel data. We construct point predictors using Tweedie's formula for the posterior mean of heterogeneous coefficients under a correlated random effects distribution. This formula utilizes cross-sectional information to transform the unit-specific (quasi-) maximum likelihood estimator into an approximation of the posterior mean under a prior distribution that equals the population distribution of the random coefficients. We show that the risk of a predictor based on a nonparametric estimate of the Tweedie correction is asymptotically equivalent to the risk of a predictor that treats the correlated-random-effects distribution as known (ratio optimality). Our empirical Bayes predictor performs well compared to various competitors in a Monte Carlo study. In an empirical application, we use the predictor to forecast revenues for a large panel of bank holding companies and compare forecasts that condition on actual and severely adverse macroeconomic conditions. Download Paper


16021 
R. Jason Faberman Guido Menzio 
Evidence on the Relationship between Recruiting and the Starting Wage  
Using data from the Employment Opportunity Pilot Project, we examine the relationship between the starting wage paid to the worker filling a vacancy, the number of applications attracted by the vacancy, the number of candidates interviewed for the vacancy, and the duration of the vacancy. We find that the wage is positively related to the duration of a vacancy and negatively related to the number of applications and interviews per week. We show that these surprising findings are consistent with a view of the labor market in which firms post wages and workers direct their search based on these wages if workers and jobs are heterogeneous and the interaction between worker’s type and job’s type in production satisfies some rather natural assumptions. Download Paper


16020 
Enrique G. Mendoza 
Macroprudential Policy: Promise and Challenges  
Macroprudential policy holds the promise of becoming a powerful tool for preventing financial crises. Financial amplification in response to domestic shocks or global spillovers and pecuniary externalities caused by Fisherian collateral constraints provide a sound theoretical foundation for this policy. Quantitative studies show that models with these constraints replicate key stylized facts of financial crises, and that the optimal financial policy of an ideal constrained-efficient social planner sharply reduces the magnitude and frequency of crises. Research also shows, however, that implementing effective macroprudential policy still faces serious hurdles. This paper highlights three of them: (i) complexity, because the optimal policy responds widely and nonlinearly to movements in both domestic factors and global spillovers due to regime shifts in global liquidity, news about global fundamentals, and recurrent innovation and regulatory changes in world markets; (ii) lack of credibility, because of the time-inconsistency of the optimal policy under commitment; and (iii) coordination failure, because a careful balance with monetary policy is needed to avoid quantitatively large inefficiencies resulting from violations of Tinbergen's rule or strategic interaction between monetary and financial authorities. Download Paper


16019 
Pablo D'Erasmo Enrique G. Mendoza 
Optimal Domestic (and External) Sovereign Default  
Infrequent but turbulent episodes of outright sovereign default on domestic creditors are considered a "forgotten history" in macroeconomics. We propose a heterogeneous-agents model in which optimal debt and default on domestic and foreign creditors are driven by distributional incentives and endogenous default costs due to the value of debt for self-insurance, liquidity, and risk-sharing. The government's aim to redistribute resources across agents and through time in response to uninsurable shocks produces a rich dynamic feedback mechanism linking debt issuance, the distribution of government bond holdings, the default decision, and risk premia. Calibrated to Spanish data, the model is consistent with key cyclical comovements and features of debt-crisis dynamics. Debt exhibits protracted fluctuations. Defaults have a low frequency of 0.93 percent, are preceded by surging debt and spreads, and occur with relatively low external debt. Default risk limits the sustainable debt, and yet spreads are zero most of the time. Download Paper


16018 
George J. Mailath Stephen Morris Andrew Postlewaite 
Laws and Authority  
A law prohibiting a particular behavior does not directly change the payoff to an individual should he engage in the prohibited behavior. Rather, any change in the individual's payoff, should he engage in the prohibited behavior, is a consequence of changes in other people's behavior. If laws do not directly change payoffs, they are "cheap talk," and can only affect behavior because people have coordinated beliefs about the effects of the law. Beginning from this point of view, we provide definitions of authority in a variety of problems, and investigate how and when individuals can have, gain, and lose authority. Download Paper


16017 
Edward Herbst Frank Schorfheide 
Tempered Particle Filtering  
The accuracy of particle filters for nonlinear state-space models crucially depends on the proposal distribution that mutates time t − 1 particle values into time t values. In the widely used bootstrap particle filter this distribution is generated by the state transition equation. While straightforward to implement, the practical performance is often poor. We develop a self-tuning particle filter in which the proposal distribution is constructed adaptively through a sequence of Monte Carlo steps. Intuitively, we start from a measurement error distribution with an inflated variance, and then gradually reduce the variance to its nominal level in a sequence of steps that we call tempering. We show that the filter generates an unbiased and consistent approximation of the likelihood function. Holding the run time fixed, our filter is substantially more accurate in two DSGE model applications than the bootstrap particle filter. Download Paper
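As a sketch of the tempering idea described in the abstract (notation ours): with a Gaussian measurement equation $y_t = \Psi(s_t) + u_t$, $u_t \sim N(0, \Sigma_u)$, the period-$t$ measurement density can be bridged through a sequence of tempered densities

```latex
p_n(y_t \mid s_t) \;\propto\; \phi_n^{d/2}
  \exp\!\Big\{ -\tfrac{\phi_n}{2}\,
    \big(y_t - \Psi(s_t)\big)' \Sigma_u^{-1} \big(y_t - \Psi(s_t)\big) \Big\},
\qquad 0 < \phi_1 < \cdots < \phi_{N} = 1,
```

so the effective measurement error variance $\Sigma_u/\phi_n$ starts inflated and shrinks to its nominal level $\Sigma_u$, with particles reweighted and mutated at each tempering stage.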


16016 
Hanming Fang Zenan Wu 
Multidimensional Private Information, Market Structure and Insurance Markets  
A large empirical literature has found that the correlation between insurance purchase and ex post realization of risk is often statistically insignificant or negative. This is inconsistent with the predictions of the classic models of insurance à la Akerlof (1970), Pauly (1974) and Rothschild and Stiglitz (1976), where consumers have one-dimensional heterogeneity in their risk types. It has been suggested that selection based on multidimensional private information, e.g., risk and risk preference types, may be able to explain the empirical findings. In this paper, we systematically investigate whether selection based on multidimensional private information in risk and risk preferences can, under different market structures, result in a negative correlation in equilibrium between insurance coverage and ex post realization of risk. We show that if the insurance market is perfectly competitive, selection based on multidimensional private information does not result in the negative correlation property in equilibrium, unless there is a sufficiently high loading factor. If the insurance market is monopolistic or imperfectly competitive, however, we show that it is possible to generate the negative correlation property in equilibrium when risk and risk preference types are sufficiently negatively dependent, a notion we formalize using the concept of copula. We also clarify the connections between some important concepts such as adverse/advantageous selection and the positive/negative correlation property. Download Paper


16015 
David Dillenberger Collin Raymond 
Group-Shift and the Consensus Effect, Second Version  
Individuals often tend to conform to the choices of others in group decisions, compared to choices made in isolation, giving rise to phenomena such as group polarization and the bandwagon effect. We show that this behavior, which we term the consensus effect, is equivalent to a well-known violation of expected utility, namely strict quasi-convexity of preferences. In contrast to the equilibrium outcome when individuals are expected utility maximizers, quasi-convexity of preferences implies that group decisions may fail to properly aggregate preferences and strictly Pareto-dominated equilibria may arise. Moreover, these problems become more severe as the size of the group grows. Download Paper


16014 
Daniel N. Hauser 
Promoting a Reputation for Quality  
I consider a model in which a firm invests both in product quality and in a costly signaling technology, and the firm's reputation is the market's belief that its quality is high. The firm influences the rate at which consumers receive information about quality: the firm can either promote, which increases the arrival rate of signals when quality is high, or censor, which decreases the arrival rate of signals when quality is low. I study how the firm's incentives to build quality and signal depend on its reputation and current quality. The firm's ability to promote or censor plays a key role in the structure of equilibria. Promotion and investment in quality are complements: the firm has stronger incentives to build quality when the promotion level is high. Costly promotion can, however, reduce the firm's incentive to build quality; this effect persists even as the cost of building quality approaches zero. Censorship and investment in quality are substitutes. The ability to censor can destroy a firm's incentives to invest in quality, because it can reduce information about poor quality products. Download Paper


16013 
Venkataraman Bhaskar George J. Mailath 
The Curse of Long Horizons  
We study dynamic moral hazard with symmetric ex ante uncertainty about the difficulty of the job. The principal and agent update their beliefs about the difficulty as they observe output. Effort is private and the principal can only offer spot contracts. The agent has an additional incentive to shirk beyond the disutility of effort when the principal induces effort: shirking results in the principal having incorrect beliefs. We show that the effort-inducing contract must provide increasingly high-powered incentives as the length of the relationship increases. Thus it is never optimal to always induce effort in very long relationships. Download Paper


16012 
Yuichi Yamamoto 
Stochastic Games With Hidden States  
This paper studies infinite-horizon stochastic games in which players observe payoffs and noisy public information about a hidden state each period. We find that, very generally, the feasible and individually rational payoff set is invariant to the initial prior about the state in the limit as the discount factor goes to one. This result ensures that players can punish or reward their opponents via continuation payoffs in a flexible way. We then prove the folk theorem, assuming that public randomization is available. The proof is constructive, and introduces the idea of random blocks to design an effective punishment mechanism. Download Paper


16011 
David Dillenberger R. Vijay Krishna Philipp Sadowski 
Subjective Dynamic Information Constraints  
We axiomatize a new class of recursive dynamic models that capture subjective constraints on the amount of information a decision maker can obtain, pay attention to, or absorb, via a Markov Decision Process for Information Choice (MIC). An MIC is a subjective decision process that specifies what type of information about the payoff-relevant state is feasible in the current period, and how the choice of what to learn now affects what can be learned in the future. The constraint imposed by the MIC is identified from choice behavior up to a recursive extension of Blackwell dominance. All the other parameters of the model, namely the anticipated evolution of the payoff-relevant state, state-dependent consumption utilities, and the discount factor, are also uniquely identified. Download Paper


16009 
Yunan Li 
Mechanism Design with Costly Verification and Limited Punishments  
A principal has to allocate a good among a number of agents, each of whom values the good. Each agent has private information about the principal's payoff if he receives the good. There are no monetary transfers. The principal can inspect agents' reports at a cost and penalize them, but the punishments are limited. I characterize an optimal mechanism featuring two thresholds: agents whose values fall below the lower threshold are pooled, as are agents whose values lie above the upper threshold. If the number of agents is small, then the pooling area at the top of the value distribution disappears. If the number of agents is large, then the two pooling areas meet and the optimal mechanism can be implemented via a shortlisting procedure. Download Paper


16008 
Jesus Fernandez-Villaverde Daniel R. Sanches 
Can Currency Competition Work?  
Can competition among privately issued fiat currencies such as Bitcoin or Ethereum work? Only sometimes. To show this, we build a model of competition among privately issued fiat currencies. We modify the current workhorse of monetary economics, the Lagos-Wright environment, by including entrepreneurs who can issue their own fiat currencies in order to maximize their utility. Otherwise, the model is standard. We show that there exists an equilibrium in which price stability is consistent with competing private monies, but also that there exists a continuum of equilibrium trajectories with the property that the value of private currencies monotonically converges to zero. These latter equilibria disappear, however, when we introduce productive capital. We also investigate the properties of hybrid monetary arrangements with private and government monies, of automata issuing money, and the role of network effects. Download Paper


16007 
Yunan Li 
Efficient Mechanisms with Information Acquisition  
This paper studies the design of ex ante efficient mechanisms in situations where a single item is for sale, and agents have positively interdependent values and can covertly acquire information at a cost before participating in a mechanism. I find that when interdependency is low and/or the number of agents is large, the ex post efficient mechanism is also ex ante efficient. In cases of high interdependency and/or a small number of agents, ex ante efficient mechanisms discourage agents from acquiring excessive information by introducing randomization to the ex post efficient allocation rule in areas where the information’s precision increases most rapidly. Download Paper


16006 
Hanming Fang Qing Gong 
Detecting Potential Overbilling in Medicare Reimbursement via Hours Worked  
Medicare overbilling refers to the phenomenon that providers report more and/or higher-intensity service codes than actually delivered in order to receive higher Medicare reimbursement. We propose a novel and easy-to-implement approach to detect potential overbilling based on the hours worked implied by the service codes physicians submit to Medicare. Using the Medicare Part B Fee-for-Service (FFS) Physician Utilization and Payment Data in 2012 and 2013 released by the Centers for Medicare and Medicaid Services (CMS), we first construct estimates of physicians' hours spent on Medicare Part B FFS beneficiaries. Despite our deliberately conservative estimation procedure, we find that about 2,300 physicians, or 3% of those with a significant fraction of Medicare Part B FFS services, have billed Medicare for over 100 hours per week. We consider these implausibly long hours. As a benchmark, the maximum hours spent on Medicare patients by physicians in National Ambulatory Medical Care Survey data are 50 hours in a week. Interestingly, we also find suggestive evidence that the coding patterns of the flagged physicians seem to be responsive to financial incentives: within code clusters with different levels of service intensity, they tend to submit more higher-intensity service codes than unflagged physicians; moreover, they are more likely to do so if the marginal revenue gain from submitting mid- or high-intensity codes is relatively high. Download Paper


16005 
David Dillenberger Collin Raymond 
Group-Shift and the Consensus Effect  
It is well documented that individuals make different choices in the context of group decisions, such as elections, than in isolation. In particular, individuals tend to conform to the decisions of others, a property we call the consensus effect, which in turn implies phenomena such as group polarization and the bandwagon effect. We show that the consensus effect is equivalent to a well-known violation of expected utility, namely strict quasi-convexity of preferences. Our results qualify and extend those of Eliaz, Ray and Razin (2006), who focus on choice shifts in groups when one option is safe (i.e., a degenerate lottery). In contrast to the equilibrium outcome when individuals are expected utility maximizers, the consensus effect implies that group decisions may fail to properly aggregate preferences in strategic contexts and strictly Pareto-dominated equilibria may arise. Moreover, these problems become more severe as the size of the group grows. Download Paper


16004 
Itzhak Gilboa Andrew Postlewaite Larry Samuelson David Schmeidler 
Economics: Between Prediction and Criticism, Second Version  
We suggest that one way in which economic analysis is useful is by offering a critique of reasoning. According to this view, economic theory may be useful not only by providing predictions, but also by pointing out weaknesses of arguments. It is argued that, when a theory requires a nontrivial act of interpretation, its roles in producing predictions and offering critiques vary in a substantial way. We offer a formal model in which these different roles can be captured. Download Paper


16003 
Itzhak Gilboa Andrew Postlewaite Larry Samuelson 
Memorable Consumption  
People often consume nondurable goods in a way that seems inconsistent with preferences for smoothing consumption over time. We suggest that such patterns of consumption can be better explained if one takes into account the future utility flows generated by memorable consumption goods, such as a honeymoon or a vacation, whose utility flow outlives their physical consumption. We consider a model in which a consumer enjoys current consumption as well as utility generated by earlier memorable consumption. Lasting utility flows are generated only by some goods, and only when their consumption exceeds customary levels by a sufficient margin. We offer axiomatic foundations for the structure of the utility function and study optimal consumption in a dynamic model. We show that rational consumers, taking into account future utility flows, would make optimal choices that rationalize lumpy patterns of consumption. Download Paper


16002 
Behrang Kamali Shahdadi 
Sorting and Peer Effects  
The effect of sorting students based on their academic performance depends not only on direct peer effects but also on indirect peer effects through teachers' efforts. Standard assumptions in the literature are insufficient to determine the effect of sorting on the performance of students, and so are silent on the effect of policies such as tracking, implementing school choice, and voucher programs. We show that the effect of such policies depends on the curvature of teachers' marginal utility of effort. We characterize conditions under which sorting increases (decreases) the total effort of teachers and the average performance of students. Download Paper


16001 
Guido Menzio Leena Rudanko Nicholas Trachter 
Relative Price Dispersion: Evidence and Theory  
We use a large dataset on retail pricing to document that a sizeable portion of the cross-sectional variation in the price at which the same good trades in the same period and in the same market is due to the fact that stores that are, on average, equally expensive set persistently different prices for the same good. We refer to this phenomenon as relative price dispersion. We argue that relative price dispersion stems from sellers' attempts to discriminate between high-valuation buyers who need to make all of their purchases in the same store, and low-valuation buyers who are willing to purchase different items from different stores. We calibrate our theory and show that it is not only consistent with the extent and sources of dispersion in the price that different sellers charge for the same good, but also with the extent and sources of dispersion in the prices that different households pay for the same basket of goods, as well as with the relationship between prices paid and the number of stores visited by different households. Download Paper


15043 
Enrique G. Mendoza Javier Bianchi Chenxin Liu 
Fundamentals News, Global Liquidity and Macroprudential Policy  
We study optimal macroprudential policy in a model in which unconventional shocks, in the form of news about future fundamentals and regime changes in world interest rates, interact with collateral constraints in driving the dynamics of financial crises. These shocks strengthen incentives to borrow in good times (i.e. when "good news" about future fundamentals coincides with a low-world-interest-rate regime), thereby increasing vulnerability to crises and enlarging the pecuniary externality due to the collateral constraints. Quantitatively, an optimal schedule of macroprudential debt taxes can lower the frequency and magnitude of financial crises, but the policy is complex because it features significant variation across interest-rate regimes and news realizations. Download Paper


15042 
Jesus Fernandez-Villaverde Juan F. Rubio-Ramírez Frank Schorfheide 
Solution and Estimation Methods for DSGE Models  
This chapter provides an overview of solution and estimation techniques for dynamic stochastic general equilibrium (DSGE) models. We cover the foundations of numerical approximation techniques as well as statistical inference and survey the latest developments in the field. Download Paper


15041 
Hanming Fang You Suk Kim Wenli Li 
The Dynamics of Adjustable-Rate Subprime Mortgage Default: A Structural Estimation  
We present a dynamic structural model of subprime adjustable-rate mortgage (ARM) borrowers making payment decisions taking into account possible consequences of different degrees of delinquency from their lenders. We empirically implement the model using unique data sets that contain information on borrowers' mortgage payment history, their broad balance sheets, and lender responses. Our investigation of the factors that drive borrowers' decisions reveals
that subprime ARMs are not all alike. For loans originated in 2004 and 2005, the interest rate
resets associated with ARMs, as well as the housing and labor market conditions, were not as
important in borrowers' delinquency decisions as in their decisions to pay off their loans. For
loans originated in 2006, interest rate resets, housing price declines, and worsening labor market
conditions all contributed importantly to their high delinquency rates. Counterfactual policy
simulations reveal that even if the Libor rate could be lowered to zero by aggressive traditional
monetary policies, it would have a limited effect on reducing the delinquency rates. We find
that automatic modification mortgage designs, under which the monthly payment or the principal
balance of the loans is automatically reduced when housing prices decline, can be effective
in reducing both delinquency and foreclosure. Importantly, we find that automatic modification
mortgages with a cushion, under which the monthly payment or principal balance reductions are triggered only when housing price declines exceed a certain percentage, may result in a Pareto
improvement in that borrowers and lenders are both made better off than under the baseline,
with lower delinquency and foreclosure rates. Our counterfactual analysis also suggests that
limited commitment power on the part of the lenders to loan modification policies may be an
important reason for the relatively small rate of modifications observed during the housing crisis. Download Paper


15040 
Francis J. DiTraglia Camilo García-Jimeno 
On Mismeasured Binary Regressors: New Results And Some Comments on the Literature, Third Version  
This paper studies the use of a discrete instrumental variable to identify the causal effect of an endogenous, mismeasured, binary treatment. We begin by showing that the only existing identification result for this case, which appears in Mahajan (2006), is incorrect. As such, identification in this model remains an open question. We first prove that the treatment effect is unidentified based on conditional first-moment information, regardless of the number of values that the instrument may take. We go on to derive a novel partial identification result based on conditional second moments that can be used to test for the presence of misclassification and to construct simple and informative bounds for the treatment effect. In certain special cases, we can in fact obtain point identification of the treatment effect based on second-moment information alone. When this is not possible, we show that adding conditional third-moment information point identifies the treatment effect and the measurement error process. Download Paper
