Working Papers
By Year:
Paper #  Author  Title  

16025 
Sarah Baird Aislinn Bohren Craig McIntosh Berk Ozler 
Optimal Design of Experiments in the Presence of Interference* (Second Version)  
This paper formalizes the optimal design of randomized controlled trials (RCTs) in the presence of interference between units, where an individual's outcome depends on the behavior and outcomes of others in her group. We focus on randomized saturation (RS) designs, which are two-stage RCTs that first randomize the treatment saturation of a group, then randomize individual treatment assignment. Our main contributions are to map the potential outcomes framework with partial interference to a regression model with clustered errors, calculate the statistical power of different RS designs, and derive analytical insights for how to optimally design an RS experiment. We show that the power to detect average treatment effects declines precisely with the ability to identify novel treatment and spillover estimands, such as how effects vary with the intensity of treatment. We provide software that assists researchers in designing RS experiments. Download Paper
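As a hedged illustration of the two-stage structure described in the abstract (the function and variable names below are our own, not the paper's notation), a randomized saturation assignment might be sketched as:

```python
import random

def randomized_saturation_assignment(groups, saturations, seed=0):
    """Two-stage assignment for a randomized saturation (RS) design.

    Stage 1: each group draws a treatment saturation (the fraction of
    its members to treat) from a menu of saturations.
    Stage 2: within each group, that fraction of individuals is
    randomly assigned to treatment.

    `groups` maps a group id to a list of individual ids; both inputs
    are illustrative, not taken from the paper.
    """
    rng = random.Random(seed)
    assignment = {}
    for gid, members in groups.items():
        pi = rng.choice(saturations)                    # stage 1: group saturation
        n_treated = round(pi * len(members))
        treated = set(rng.sample(members, n_treated))   # stage 2: individuals
        assignment[gid] = {
            "saturation": pi,
            "treated": sorted(treated),
            "control": sorted(set(members) - treated),
        }
    return assignment

groups = {g: list(range(10 * g, 10 * g + 10)) for g in range(4)}
plan = randomized_saturation_assignment(groups, saturations=[0.0, 0.5, 1.0])
```

Note that untreated individuals in a partially treated group serve as within-cluster spillover controls, which is what distinguishes an RS design from a simple cluster RCT.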


16024 
Aislinn Bohren 
Using Persistence to Generate Incentives in a Dynamic Moral Hazard Problem  
This paper studies how persistence can be used to create incentives in a continuous-time stochastic game in which a long-run player interacts with a sequence of short-run players. Observations of the long-run player's actions are distorted by a Brownian motion, and the actions of both players impact future payoffs through a state variable. For example, a firm or worker provides customers with a product, and the quality of this product depends on both current and past investment choices by the firm. I derive general conditions under which a Markov equilibrium emerges as the unique perfect public equilibrium, and characterize the equilibrium payoff and actions in this equilibrium, for any discount rate. I develop an application of persistent product quality to illustrate how persistence creates effective intertemporal incentives in a setting where traditional channels fail, and explore how the structure of persistence impacts equilibrium behavior. This demonstrates the power of the continuous-time setting to deliver sharp insights and a tractable equilibrium characterization for a rich class of dynamic games. Download Paper


16023 
Aislinn Bohren Troy Kravitz 
Optimal Contracting with Costly State Verification, with an Application to Crowdsourcing  
A firm employs workers to obtain costly unverifiable information, for example, categorizing the content of images. Workers are monitored by comparing their messages. The optimal contract under limited liability exhibits three key features: (i) the monitoring technology depends crucially on the commitment power of the firm (virtual monitoring, or monitoring with arbitrarily small probability, is optimal when the firm can commit to truthfully reveal messages from other workers, while monitoring with strictly positive probability is optimal when the firm can hide messages, i.e., under partial commitment); (ii) bundling multiple tasks reduces worker rents and monitoring inefficiencies; and (iii) the optimal contract is approximately efficient under full but not partial commitment. We conclude with an application to crowdsourcing platforms, and characterize the optimal contract for tasks found on these platforms. Download Paper


16022 
Laura Liu Hyungsik Roger Moon Frank Schorfheide 
Forecasting with Dynamic Panel Data Models  
This paper considers the problem of forecasting a collection of short time series using cross-sectional information in panel data. We construct point predictors using Tweedie's formula for the posterior mean of heterogeneous coefficients under a correlated random effects distribution. This formula utilizes cross-sectional information to transform the unit-specific (quasi) maximum likelihood estimator into an approximation of the posterior mean under a prior distribution that equals the population distribution of the random coefficients. We show that the risk of a predictor based on a nonparametric estimate of the Tweedie correction is asymptotically equivalent to the risk of a predictor that treats the correlated-random-effects distribution as known (ratio optimality). Our empirical Bayes predictor performs well compared to various competitors in a Monte Carlo study. In an empirical application we use the predictor to forecast revenues for a large panel of bank holding companies and compare forecasts that condition on actual and severely adverse macroeconomic conditions. Download Paper
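For readers unfamiliar with it, Tweedie's formula in its scalar Gaussian form is a standard textbook statement (the notation below is generic, not the paper's): if $\hat{\theta}_i \mid \theta_i \sim N(\theta_i, \sigma^2)$ and $p(\cdot)$ denotes the marginal density of $\hat{\theta}_i$, then

```latex
\mathbb{E}\left[\theta_i \mid \hat{\theta}_i\right]
  = \hat{\theta}_i
  + \sigma^2 \, \frac{d}{d\hat{\theta}_i} \log p\!\left(\hat{\theta}_i\right)
```

so an estimate of the marginal density of the unit-specific estimators is, in this simple case, all that is needed to approximate the posterior-mean correction.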


16021 
R. Jason Faberman Guido Menzio 
Evidence on the Relationship between Recruiting and the Starting Wage  
Using data from the Employment Opportunity Pilot Project, we examine the relationship between the starting wage paid to the worker filling a vacancy, the number of applications attracted by the vacancy, the number of candidates interviewed for the vacancy, and the duration of the vacancy. We find that the wage is positively related to the duration of a vacancy and negatively related to the number of applications and interviews per week. We show that these surprising findings are consistent with a view of the labor market in which firms post wages and workers direct their search based on these wages, if workers and jobs are heterogeneous and the interaction between the worker's type and the job's type in production satisfies some rather natural assumptions. Download Paper


16020 
Enrique G. Mendoza 
Macroprudential Policy: Promise and Challenges  
Macroprudential policy holds the promise of becoming a powerful tool for preventing financial crises. Financial amplification in response to domestic shocks or global spillovers and pecuniary externalities caused by Fisherian collateral constraints provide a sound theoretical foundation for this policy. Quantitative studies show that models with these constraints replicate key stylized facts of financial crises, and that the optimal financial policy of an ideal constrained-efficient social planner reduces sharply the magnitude and frequency of crises. Research also shows, however, that implementing effective macroprudential policy still faces serious hurdles. This paper highlights three of them: (i) complexity, because the optimal policy responds widely and nonlinearly to movements in both domestic factors and global spillovers due to regime shifts in global liquidity, news about global fundamentals, and recurrent innovation and regulatory changes in world markets; (ii) lack of credibility, because of the time inconsistency of the optimal policy under commitment; and (iii) coordination failure, because a careful balance with monetary policy is needed to avoid quantitatively large inefficiencies resulting from violations of Tinbergen’s rule or strategic interaction between monetary and financial authorities. Download Paper


16019 
Pablo D'Erasmo Enrique G. Mendoza 
Optimal Domestic (and External) Sovereign Default  
Infrequent but turbulent episodes of outright sovereign default on domestic creditors are considered a “forgotten history” in Macroeconomics. We propose a heterogeneous-agents model in which optimal debt and default on domestic and foreign creditors are driven by distributional incentives and endogenous default costs due to the value of debt for self-insurance, liquidity and risk sharing. The government’s aim to redistribute resources across agents and through time in response to uninsurable shocks produces a rich dynamic feedback mechanism linking debt issuance, the distribution of government bond holdings, the default decision, and risk premia. Calibrated to Spanish data, the model is consistent with key cyclical comovements and features of debt-crisis dynamics. Debt exhibits protracted fluctuations. Defaults have a low frequency of 0.93 percent, are preceded by surging debt and spreads, and occur with relatively low external debt. Default risk limits the sustainable debt and yet spreads are zero most of the time. Download Paper


16018 
George J. Mailath Stephen Morris Andrew Postlewaite 
Laws and Authority  
A law prohibiting a particular behavior does not directly change the payoff to an individual should he engage in the prohibited behavior. Rather, any change in the individual’s payoff, should he engage in the prohibited behavior, is a consequence of changes in other people’s behavior. If laws do not directly change payoffs, they are “cheap talk,” and can only affect behavior because people have coordinated beliefs about the effects of the law. Beginning from this point of view, we provide definitions of authority in a variety of problems, and investigate how and when individuals can have, gain, and lose authority. Download Paper


16017 
Edward Herbst Frank Schorfheide 
Tempered Particle Filtering  
The accuracy of particle filters for nonlinear state-space models crucially depends on the proposal distribution that mutates time t − 1 particle values into time t values. In the widely used bootstrap particle filter this distribution is generated by the state transition equation. While straightforward to implement, the practical performance is often poor. We develop a self-tuning particle filter in which the proposal distribution is constructed adaptively through a sequence of Monte Carlo steps. Intuitively, we start from a measurement error distribution with an inflated variance, and then gradually reduce the variance to its nominal level in a sequence of steps that we call tempering. We show that the filter generates an unbiased and consistent approximation of the likelihood function. Holding the run time fixed, our filter is substantially more accurate in two DSGE model applications than the bootstrap particle filter. Download Paper
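As a rough illustration of the tempering idea (variance inflation followed by gradual deflation), here is a stripped-down sketch on a static toy model: a scalar Gaussian prior and one observation. The names, the fixed tempering schedule, and the single Metropolis mutation step per stage are our simplifications; the paper's filter works recursively inside a state-space model and tunes the schedule adaptively.

```python
import math
import random

def tempered_update(particles, y, sigma, phis, rng):
    """Move prior particles toward the posterior p(x | y) for the toy
    model x ~ N(0, 1), y = x + N(0, sigma^2), by raising the likelihood
    to powers phi in `phis` (ending at 1.0, the nominal variance)."""
    logp = lambda x, phi: -0.5 * x * x - phi * (y - x) ** 2 / (2 * sigma ** 2)
    phi_prev = 0.0
    for phi in phis:
        # correction: reweight by the incremental tempered likelihood
        logw = [-(phi - phi_prev) * (y - x) ** 2 / (2 * sigma ** 2)
                for x in particles]
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        # selection: multinomial resampling
        particles = rng.choices(particles, weights=w, k=len(particles))
        # mutation: one random-walk Metropolis step at this temperature
        moved = []
        for x in particles:
            prop = x + 0.5 * rng.gauss(0, 1)
            if rng.random() < math.exp(min(0.0, logp(prop, phi) - logp(x, phi))):
                x = prop
            moved.append(x)
        particles = moved
        phi_prev = phi
    return particles

rng = random.Random(42)
prior = [rng.gauss(0, 1) for _ in range(2000)]
post = tempered_update(prior, y=2.0, sigma=1.0, phis=[0.25, 0.5, 1.0], rng=rng)
```

With a standard normal prior and y = 2.0, sigma = 1.0, the exact posterior is N(1.0, 0.5), so the particle cloud should end up centered near 1 with variance near 0.5.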


16016 
Hanming Fang Zenan Wu 
Multidimensional Private Information, Market Structure and Insurance Markets  
A large empirical literature found that the correlation between insurance purchase and ex post realization of risk is often statistically insignificant or negative. This is inconsistent with the predictions from the classic models of insurance à la Akerlof (1970), Pauly (1974) and Rothschild and Stiglitz (1976), where consumers have one-dimensional heterogeneity in their risk types. It is suggested that selection based on multidimensional private information, e.g., risk and risk preference types, may be able to explain the empirical findings. In this paper, we systematically investigate whether selection based on multidimensional private information in risk and risk preferences can, under different market structures, result in a negative correlation in equilibrium between insurance coverage and ex post realization of risk. We show that if the insurance market is perfectly competitive, selection based on multidimensional private information does not result in the negative correlation property in equilibrium, unless there is a sufficiently high loading factor. If the insurance market is monopolistic or imperfectly competitive, however, we show that it is possible to generate the negative correlation property in equilibrium when risk and risk preference types are sufficiently negatively dependent, a notion we formalize using the concept of a copula. We also clarify the connections between some of the important concepts such as adverse/advantageous selection and positive/negative correlation property. Download Paper


16015 
David Dillenberger Collin Raymond 
Group-Shift and the Consensus Effect, Second Version  
Individuals often tend to conform to the choices of others in group decisions, compared to choices made in isolation, giving rise to phenomena such as group polarization and the bandwagon effect. We show that this behavior, which we term the consensus effect, is equivalent to a well-known violation of expected utility, namely strict quasi-convexity of preferences. In contrast to the equilibrium outcome when individuals are expected utility maximizers, quasi-convexity of preferences implies that group decisions may fail to properly aggregate preferences and strictly Pareto-dominated equilibria may arise. Moreover, these problems become more severe as the size of the group grows. Download Paper


16014 
Daniel N. Hauser 
Promoting a Reputation for Quality  
I consider a model in which a firm invests in both product quality and in a costly signaling technology, and the firm's reputation is the market's belief that its quality is high. The firm influences the rate at which consumers receive information about quality: the firm can either promote, which increases the arrival rate of signals when quality is high, or censor, which decreases the arrival rate of signals when quality is low. I study how the firm's incentives to build quality and signal depend on its reputation and current quality. The firm's ability to promote or censor plays a key role in the structure of equilibria. Promotion and investment in quality are complements: the firm has stronger incentives to build quality when the promotion level is high. Costly promotion can, however, reduce the firm's incentive to build quality; this effect persists even as the cost of building quality approaches zero. Censorship and investment in quality are substitutes. The ability to censor can destroy a firm's incentives to invest in quality, because it can reduce information about poor quality products. Download Paper


16013 
Venkataraman Bhaskar George J. Mailath 
The Curse of Long Horizons  
We study dynamic moral hazard with symmetric ex ante uncertainty about the difficulty of the job. The principal and agent update their beliefs about the difficulty as they observe output. Effort is private and the principal can only offer spot contracts. The agent has an additional incentive to shirk beyond the disutility of effort when the principal induces effort: shirking results in the principal having incorrect beliefs. We show that the effort-inducing contract must provide increasingly high-powered incentives as the length of the relationship increases. Thus it is never optimal to always induce effort in very long relationships. Download Paper


16012 
Yuichi Yamamoto 
Stochastic Games With Hidden States (Fourth Version)  
This paper studies infinite-horizon stochastic games in which players observe payoffs and noisy public information about a hidden state each period. We find that, very generally, the feasible and individually rational payoff set is invariant to the initial prior about the state in the limit as the discount factor goes to one. This result ensures that players can punish or reward the opponents via continuation payoffs in a flexible way. Then we prove the folk theorem, assuming that public randomization is available. The proof is constructive, and introduces the idea of random blocks to design an effective punishment mechanism. Download Paper


16011 
David Dillenberger R. Vijay Krishna Philipp Sadowski 
Subjective Dynamic Information Constraints  
We axiomatize a new class of recursive dynamic models that capture subjective constraints on the amount of information a decision maker can obtain, pay attention to, or absorb, via a Markov Decision Process for Information Choice (MIC). An MIC is a subjective decision process that specifies what type of information about the payoff-relevant state is feasible in the current period, and how the choice of what to learn now affects what can be learned in the future. The constraint imposed by the MIC is identified from choice behavior up to a recursive extension of Blackwell dominance. All the other parameters of the model, namely the anticipated evolution of the payoff-relevant state, state-dependent consumption utilities, and the discount factor, are also uniquely identified. Download Paper


16009 
Yunan Li 
Mechanism Design with Costly Verification and Limited Punishments (Third Version)  
A principal has to allocate a good among a number of agents, each of whom values the good. Each agent has private information about the principal's payoff if he receives the good. There are no monetary transfers. The principal can inspect agents' reports at a cost and penalize them, but the punishments are limited. I characterize an optimal mechanism featuring two thresholds. Agents whose values are below the lower threshold and above the upper threshold are pooled, respectively. If the number of agents is small, then the pooling area at the top of the value distribution disappears. If the number of agents is large, then the two pooling areas meet and the optimal mechanism can be implemented via a shortlisting procedure. Download Paper


16008 
Jesus Fernandez-Villaverde Daniel R. Sanches 
Can Currency Competition Work?  
Can competition among privately issued fiat currencies such as Bitcoin or Ethereum work? Only sometimes. To show this, we build a model of competition among privately issued fiat currencies. We modify the current workhorse of monetary economics, the Lagos-Wright environment, by including entrepreneurs who can issue their own fiat currencies in order to maximize their utility. Otherwise, the model is standard. We show that there exists an equilibrium in which price stability is consistent with competing private monies, but also that there exists a continuum of equilibrium trajectories with the property that the value of private currencies monotonically converges to zero. These latter equilibria disappear, however, when we introduce productive capital. We also investigate the properties of hybrid monetary arrangements with private and government monies, of automata issuing money, and the role of network effects. Download Paper


16007 
Yunan Li 
Efficient Mechanisms with Information Acquisition  
This paper studies the design of ex ante efficient mechanisms in situations where a single item is for sale, and agents have positively interdependent values and can covertly acquire information at a cost before participating in a mechanism. I find that when interdependency is low and/or the number of agents is large, the ex post efficient mechanism is also ex ante efficient. In cases of high interdependency and/or a small number of agents, ex ante efficient mechanisms discourage agents from acquiring excessive information by introducing randomization to the ex post efficient allocation rule in areas where the information’s precision increases most rapidly. Download Paper


16006 
Hanming Fang Qing Gong 
Detecting Potential Overbilling in Medicare Reimbursement via Hours Worked  
Medicare overbilling refers to the phenomenon that providers report more and/or higher-intensity service codes than actually delivered to receive higher Medicare reimbursement. We propose a novel and easy-to-implement approach to detect potential overbilling based on the hours worked implied by the service codes physicians submit to Medicare. Using the Medicare Part B Fee-for-Service (FFS) Physician Utilization and Payment Data in 2012 and 2013 released by the Centers for Medicare and Medicaid Services (CMS), we first construct estimates for physicians' hours spent on Medicare Part B FFS beneficiaries. Despite our deliberately conservative estimation procedure, we find that about 2,300 physicians, or 3% of those with a significant fraction of Medicare Part B FFS services, have billed Medicare over 100 hours per week. We consider these implausibly long hours. As a benchmark, the maximum hours spent on Medicare patients by physicians in National Ambulatory Medical Care Survey data are 50 hours in a week. Interestingly, we also find suggestive evidence that the coding patterns of the flagged physicians seem to be responsive to financial incentives: within code clusters with different levels of service intensity, they tend to submit more high-intensity service codes than unflagged physicians; moreover, they are more likely to do so if the marginal revenue gain from submitting mid- or high-intensity codes is relatively high. Download Paper
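To make the implied-hours screen concrete, here is a hypothetical sketch (the service codes, per-code minutes, and claim counts are invented for illustration; the paper's time estimates are built conservatively from CMS billing data and survey benchmarks):

```python
def implied_weekly_hours(claims, minutes_per_code):
    """Hours implied by a physician's billed codes in one week.

    `claims` maps a service code to the number of times it was billed;
    `minutes_per_code` maps each code to an assumed time per service.
    Both mappings here are hypothetical, not the paper's estimates.
    """
    total_minutes = sum(n * minutes_per_code[code] for code, n in claims.items())
    return total_minutes / 60.0

def flag_implausible(physician_hours, threshold=100.0):
    """Physicians whose implied weekly Medicare hours exceed the
    threshold (the paper uses 100 hours/week as its headline cutoff)."""
    return sorted(p for p, h in physician_hours.items() if h > threshold)

minutes = {"99213": 15, "99214": 25}            # hypothetical per-code durations
week = {"dr_a": {"99213": 200, "99214": 100},   # ~91.7 implied hours
        "dr_b": {"99213": 300, "99214": 180}}   # 150 implied hours
hours = {p: implied_weekly_hours(c, minutes) for p, c in week.items()}
flagged = flag_implausible(hours)               # only dr_b exceeds 100 hours
```

Lower assumed minutes per code shrink the implied hours, which is why a deliberately conservative choice of durations makes the resulting flags harder to dismiss.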


16005 
David Dillenberger Collin Raymond 
Group-Shift and the Consensus Effect  
It is well documented that individuals make different choices in the context of group decisions, such as elections, from choices made in isolation. In particular, individuals tend to conform to the decisions of others, a property we call the consensus effect, which in turn implies phenomena such as group polarization and the bandwagon effect. We show that the consensus effect is equivalent to a well-known violation of expected utility, namely strict quasi-convexity of preferences. Our results qualify and extend those of Eliaz, Ray and Razin (2006), who focus on choice shifts in groups when one option is safe (i.e., a degenerate lottery). In contrast to the equilibrium outcome when individuals are expected utility maximizers, the consensus effect implies that group decisions may fail to properly aggregate preferences in strategic contexts and strictly Pareto-dominated equilibria may arise. Moreover, these problems become more severe as the size of the group grows. Download Paper


16004 
Itzhak Gilboa Andrew Postlewaite Larry Samuelson David Schmeidler 
Economics: Between Prediction and Criticism, Second Version  
We suggest that one way in which economic analysis is useful is by offering a critique of reasoning. According to this view, economic theory may be useful not only by providing predictions, but also by pointing out weaknesses of arguments. It is argued that, when a theory requires a nontrivial act of interpretation, its roles in producing predictions and offering critiques vary in a substantial way. We offer a formal model in which these different roles can be captured. Download Paper


16003 
Itzhak Gilboa Andrew Postlewaite Larry Samuelson 
Memorable Consumption  
People often consume nondurable goods in a way that seems inconsistent with preferences for smoothing consumption over time. We suggest that such patterns of consumption can be better explained if one takes into account the future utility flows generated by memorable consumption goods, such as a honeymoon or a vacation, whose utility flow outlives their physical consumption. We consider a model in which a consumer enjoys current consumption as well as utility generated by earlier memorable consumption. Lasting utility flows are generated only by some goods, and only when their consumption exceeds customary levels by a sufficient margin. We offer axiomatic foundations for the structure of the utility function and study optimal consumption in a dynamic model. We show that rational consumers, taking into account future utility flows, would make optimal choices that rationalize lumpy patterns of consumption. Download Paper


16002 
Behrang Kamali-Shahdadi 
Sorting and Peer Effects  
The effect of sorting students based on their academic performances depends not only on direct peer effects but also on indirect peer effects through teachers' efforts. Standard assumptions in the literature are insufficient to determine the effect of sorting on the performances of students and so are silent on the effect of policies such as tracking, implementing school choice, and voucher programs. We show that the effect of such policies depends on the curvature of teachers' marginal utility of effort. We characterize conditions under which sorting increases (decreases) the total effort of teachers and the average performance of students. Download Paper


16001 
Guido Menzio Leena Rudanko Nicholas Trachter 
Relative Price Dispersion: Evidence and Theory  
We use a large dataset on retail pricing to document that a sizeable portion of the cross-sectional variation in the price at which the same good trades in the same period and in the same market is due to the fact that stores that are, on average, equally expensive set persistently different prices for the same good. We refer to this phenomenon as relative price dispersion. We argue that relative price dispersion stems from sellers’ attempts to discriminate between high-valuation buyers who need to make all of their purchases in the same store, and low-valuation buyers who are willing to purchase different items from different stores. We calibrate our theory and show that it is not only consistent with the extent and sources of dispersion in the price that different sellers charge for the same good, but also with the extent and sources of dispersion in the prices that different households pay for the same basket of goods, as well as with the relationship between prices paid and the number of stores visited by different households. Download Paper


15043 
Enrique G. Mendoza Javier Bianchi Chenxin Liu 
Fundamentals News, Global Liquidity and Macroprudential Policy*  
We study optimal macroprudential policy in a model in which unconventional shocks, in the form of news about future fundamentals and regime changes in world interest rates, interact with collateral constraints in driving the dynamics of financial crises. These shocks strengthen incentives to borrow in good times (i.e. when "good news" about future fundamentals coincides with a low-world-interest-rate regime), thereby increasing vulnerability to crises and enlarging the pecuniary externality due to the collateral constraints. Quantitatively, an optimal schedule of macroprudential debt taxes can lower the frequency and magnitude of financial crises, but the policy is complex because it features significant variation across interest-rate regimes and news realizations. Download Paper


15042 
Jesus Fernandez-Villaverde Juan F. Rubio-Ramírez Frank Schorfheide 
Solution and Estimation Methods for DSGE Models  
This chapter provides an overview of solution and estimation techniques for dynamic stochastic general equilibrium (DSGE) models. We cover the foundations of numerical approximation techniques as well as statistical inference and survey the latest developments in the field. Download Paper


15041 
Hanming Fang You Suk Kim Wenli Li 
The Dynamics of Adjustable-Rate Subprime Mortgage Default: A Structural Estimation  
We present a dynamic structural model of subprime adjustable-rate mortgage (ARM) borrowers making payment decisions taking into account possible consequences of different degrees of delinquency from their lenders. We empirically implement the model using unique data sets that contain information on borrowers' mortgage payment history, their broad balance sheets, and lender responses. Our investigation of the factors that drive borrowers' decisions reveals that subprime ARMs are not all alike. For loans originated in 2004 and 2005, the interest rate resets associated with ARMs, as well as the housing and labor market conditions, were not as important in borrowers' delinquency decisions as in their decisions to pay off their loans. For loans originated in 2006, interest rate resets, housing price declines, and worsening labor market conditions all contributed importantly to their high delinquency rates. Counterfactual policy simulations reveal that even if the LIBOR rate could be lowered to zero by aggressive traditional monetary policies, it would have a limited effect on reducing the delinquency rates. We find that automatic modification mortgage designs, under which the monthly payment or the principal balance of the loans is automatically reduced when housing prices decline, can be effective in reducing both delinquency and foreclosure. Importantly, we find that automatic modification mortgages with a cushion, under which the monthly payment or principal balance reductions are triggered only when housing price declines exceed a certain percentage, may result in a Pareto improvement in that borrowers and lenders are both made better off than under the baseline, with lower delinquency and foreclosure rates. Our counterfactual analysis also suggests that limited commitment power on the part of the lenders to loan modification policies may be an important reason for the relatively small rate of modifications observed during the housing crisis. Download Paper


15040 
Francis J. DiTraglia Camilo Garcia-Jimeno 
On Mismeasured Binary Regressors: New Results And Some Comments on the Literature, Third Version  
This paper studies the use of a discrete instrumental variable to identify the causal effect of an endogenous, mismeasured, binary treatment. We begin by showing that the only existing identification result for this case, which appears in Mahajan (2006), is incorrect. As such, identification in this model remains an open question. We first prove that the treatment effect is unidentified based on conditional firstmoment information, regardless of the number of values that the instrument may take. We go on to derive a novel partial identification result based on conditional second moments that can be used to test for the presence of misclassification and to construct simple and informative bounds for the treatment effect. In certain special cases, we can in fact obtain point identification of the treatment effect based on second moment information alone. When this is not possible, we show that adding conditional third moment information point identifies the treatment effect and the measurement error process. Download Paper


15039 
Francis J. DiTraglia Camilo Garcia-Jimeno 
On Mismeasured Binary Regressors: New Results And Some Comments on the Literature, Second Version  
This paper studies the use of a discrete instrumental variable to identify the causal effect of an endogenous, mismeasured, binary treatment. We begin by showing that the only existing identification result for this case, which appears in Mahajan (2006), is incorrect. As such, identification in this model remains an open question. We first prove that the treatment effect is unidentified based on conditional first-moment information, regardless of the number of values that the instrument may take. We go on to derive a novel partial identification result based on conditional second moments that can be used to test for the presence of misclassification and to construct simple and informative bounds for the treatment effect. In certain special cases, we can in fact obtain point identification of the treatment effect based on second moment information alone. When this is not possible, we show that adding conditional third moment information point identifies the treatment effect and the measurement error process.
Keywords: Instrumental variables, Measurement error, Endogeneity, Binary regressor, Partial Identification
JEL Codes: C10, C18, C25, C26 Download Paper


15038 
Mikhail Golosov Guido Menzio 
Agency Business Cycles  
We propose a new business cycle theory. Firms need to randomize over firing or keeping workers who have performed poorly in the past, in order to give them an ex ante incentive to exert effort. Firms have an incentive to coordinate the outcome of their randomizations, as coordination allows them to load the firing probability on states of the world in which it is costlier for workers to become unemployed and, hence, allows them to reduce overall agency costs. In the unique robust equilibrium, firms use a sunspot to coordinate the randomization outcomes and the economy experiences endogenous, stochastic aggregate fluctuations.
JEL Codes: D86, E24, E32.
Keywords: Unemployment, Moral Hazard, Endogenous Business Cycles. Download Paper


15037 
Francis J. DiTraglia Camilo Garcia-Jimeno 
On Mismeasured Binary Regressors: New Results And Some Comments on the Literature  
This Version: November 2, 2015, First Version: October 31, 2015
This paper studies the use of a discrete instrumental variable to identify the causal effect of an endogenous, mismeasured, binary treatment in a homogeneous effects model with additively separable errors. We begin by showing that the only existing identification result for this case, which appears in Mahajan (2006), is incorrect. As such, identification in this model remains an open question. We provide a convenient notational framework to address this question and use it to derive a number of results. First, we prove that the treatment effect is unidentified based on conditional first-moment information, regardless of the number of values that the instrument may take. Second, we derive a novel partial identification result based on conditional second moments that can be used to test for the presence of misclassification and to construct bounds for the treatment effect. In certain special cases, we can in fact obtain point identification of the treatment effect based on second moment information alone. When this is not possible, we show that adding conditional third moment information point identifies the treatment effect and completely characterizes the measurement error process.
Keywords: Instrumental variables, Measurement error, Endogeneity, Binary regressor, Partial Identification
JEL Codes: C10, C18, C25, C26 Download Paper
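The general problem the paper studies can be illustrated with a small simulation (a hypothetical sketch of why misclassification matters for instrumental variables, not a reproduction of the paper's identification strategy; all parameter values below are invented for illustration): when a binary treatment is misclassified, the first stage of a Wald/IV estimator is attenuated by the total misclassification rate, so the IV estimate is inflated rather than consistent.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000

# Hypothetical data-generating process: binary instrument z shifts the true
# treatment t_star; the observed treatment is misclassified with rates
# a0 = P(t_obs=1 | t_star=0) and a1 = P(t_obs=0 | t_star=1).
beta = 2.0          # true treatment effect
a0, a1 = 0.1, 0.2   # misclassification rates (hypothetical)
z = rng.binomial(1, 0.5, n)
u = rng.normal(0, 1, n)  # endogenous error: enters both treatment and outcome
t_star = (0.2 + 0.5 * z + 0.5 * u + rng.normal(0, 1, n) > 0.5).astype(int)
flip = np.where(t_star == 1, rng.random(n) < a1, rng.random(n) < a0)
t_obs = np.where(flip, 1 - t_star, t_star)
y = beta * t_star + u + rng.normal(0, 1, n)

dy = y[z == 1].mean() - y[z == 0].mean()
# Wald estimator with the true treatment recovers beta...
iv_true = dy / (t_star[z == 1].mean() - t_star[z == 0].mean())
# ...but with the misclassified treatment the first-stage difference shrinks
# by the factor (1 - a0 - a1), so the estimate converges to beta/(1 - a0 - a1).
iv_obs = dy / (t_obs[z == 1].mean() - t_obs[z == 0].mean())
print(iv_true, iv_obs)
```

In this sketch the instrument alone does not repair the misclassification bias, which is consistent with the paper's point that first-moment information is insufficient and higher moments are needed.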


15036 
Hanming Fang Rongzhu Ke Li-An Zhou 
Rosca Meets Formal Credit Market  
The Rotating Savings and Credit Association (Rosca) is an important informal financial institution in many parts of the world, used by participants to share income risks. What is the role of the Rosca when a formal credit market is introduced? We develop a model in which risk-averse participants attempt to hedge against their private income shocks with access to both a Rosca and the formal credit market, and we investigate their interactions. Using the gap between the borrowing and saving interest rates as a measure of the imperfection of the credit market, we compare three cases: (i) a Rosca without a credit market; (ii) a Rosca with a perfect credit market; (iii) a Rosca with an imperfect credit market. We show that a perfect credit market completely crowds out the role of the Rosca. However, when the credit market is present but imperfect, we show that the Rosca and the formal credit market can complement each other in improving social welfare. Interestingly, we find that social welfare in an environment with both a Rosca and a formal credit market does not necessarily increase monotonically as the imperfection of the credit market converges to zero. Download Paper


15035 
Jesus FernandezVillaverde 
Magna Carta, the Rule of Law, and the Limits on Government  
This paper surveys the legal tradition that links Magna Carta with the modern concepts of the rule of law and the limits on government. It documents that the original understanding of the rule of law included substantive commitments to individual freedom and limited government. It then attempts to explain how and why such commitments were lost to a formalist interpretation of the rule of law from 1848 to 1939. The paper concludes by arguing that a revival of the substantive commitments of the rule of law is central to the project of reshaping modern states. Download Paper


15034 
George J. Mailath Andrew Postlewaite Larry Samuelson 
Premuneration Values and Investments in Matching Markets  
We analyze a model in which agents make investments and then match into pairs to create a surplus. The agents can make transfers to reallocate their pretransfer ownership claims on the surplus. Mailath, Postlewaite, and Samuelson (2013) showed that when investments are unobservable, equilibrium investments are generally inefficient. In this paper we work with a more structured model that is sufficiently tractable to analyze the nature of the investment inefficiencies. We provide conditions under which investment is inefficiently high or low and conditions under which changes in the pretransfer ownership claims on the surplus will be Pareto improving, and we examine how the degree of heterogeneity on either side of the market affects investment efficiency. Download Paper


15033 
Pablo D'Erasmo Enrique G. Mendoza Jing Zhang 
"What is a Sustainable Public Debt?"  
The question of what is a sustainable public debt is paramount in the macroeconomic analysis of fiscal policy. This question is usually formulated as asking whether the outstanding public debt and its projected path are consistent with those of the government's revenues and expenditures (i.e. whether fiscal solvency conditions hold). We identify critical flaws in the traditional approach to evaluate debt sustainability, and examine three alternative approaches that provide useful econometric and model-simulation tools to analyze debt sustainability. The first approach is Bohn's non-structural empirical framework based on a fiscal reaction function that characterizes the dynamics of sustainable debt and primary balances. The second is a structural approach based on a calibrated dynamic general equilibrium framework with a fully specified fiscal sector, which we use to quantify the positive and normative effects of fiscal policies aimed at restoring fiscal solvency in response to changes in debt. The third approach deviates from the others in assuming that governments cannot commit to repay their domestic debt, and can thus optimally decide to default even if debt is sustainable in terms of fiscal solvency. We use these three approaches to analyze debt sustainability in the United States and Europe after the recent surge in public debt following the 2008 crisis, and find that all three raise serious questions about the prospects of fiscal adjustment and its consequences in advanced economies. Download Paper
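The first approach mentioned in the abstract, a fiscal reaction function, can be sketched in a few lines (a minimal simulated illustration of the general idea, not the paper's estimation; all parameter values are hypothetical): regress the primary balance on lagged debt and check whether the response is positive.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 2000

# Hypothetical economy: the primary balance responds to lagged debt,
#   pb_t = mu + rho * d_{t-1} + eps_t,
# and the debt-to-GDP ratio evolves as d_t = (1 + r) * d_{t-1} - pb_t.
mu, rho, r = -0.02, 0.08, 0.03
d = np.empty(T + 1)
pb = np.empty(T)
d[0] = 0.6  # initial debt-to-GDP ratio
for t in range(T):
    pb[t] = mu + rho * d[t] + rng.normal(0, 0.01)
    d[t + 1] = (1 + r) * d[t] - pb[t]

# OLS of the primary balance on lagged debt. A positive, sufficiently strong
# slope (here rho > r, so debt follows a stable AR(1) with coefficient
# 1 + r - rho < 1) is the kind of systematic corrective response that this
# class of tests looks for.
X = np.column_stack([np.ones(T), d[:-1]])
coef, *_ = np.linalg.lstsq(X, pb, rcond=None)
rho_hat = coef[1]
print(rho_hat)
```

In this simulated setting the estimated slope recovers the true response of the primary balance to debt; with real fiscal data the specification and its interpretation are considerably more delicate, which is part of what the paper examines.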
