Working Papers
By Year:
Paper #  Author  Title  

16012 
Yuichi Yamamoto 
Stochastic Games With Hidden States (Fourth Version)  
This paper studies infinite-horizon stochastic games in which players observe payoffs and noisy public information about a hidden state each period. We find that, very generally, the feasible and individually rational payoff set is invariant to the initial prior about the state in the limit as the discount factor goes to one. This result ensures that players can punish or reward their opponents via continuation payoffs in a flexible way. We then prove the folk theorem, assuming that public randomization is available. The proof is constructive and introduces the idea of random blocks to design an effective punishment mechanism.


16011 
David Dillenberger, R. Vijay Krishna, Philipp Sadowski 
Subjective Dynamic Information Constraints 
We axiomatize a new class of recursive dynamic models that capture subjective constraints on the amount of information a decision maker can obtain, pay attention to, or absorb, via a Markov Decision Process for Information Choice (MIC). An MIC is a subjective decision process that specifies what type of information about the payoff-relevant state is feasible in the current period, and how the choice of what to learn now affects what can be learned in the future. The constraint imposed by the MIC is identified from choice behavior up to a recursive extension of Blackwell dominance. All the other parameters of the model, namely the anticipated evolution of the payoff-relevant state, state-dependent consumption utilities, and the discount factor, are also uniquely identified.


16009 
Yunan Li 
Mechanism Design with Costly Verification and Limited Punishments (Third Version)  
A principal has to allocate a good among a number of agents, each of whom values the good. Each agent has private information about the principal's payoff if he receives the good. There are no monetary transfers. The principal can inspect agents' reports at a cost and penalize them, but the punishments are limited. I characterize an optimal mechanism featuring two thresholds: agents whose values fall below the lower threshold or above the upper threshold are pooled. If the number of agents is small, the pooling region at the top of the value distribution disappears. If the number of agents is large, the two pooling regions meet and the optimal mechanism can be implemented via a shortlisting procedure.


16008 
Jesús Fernández-Villaverde, Daniel R. Sanches 
Can Currency Competition Work?  
Can competition among privately issued fiat currencies such as Bitcoin or Ethereum work? Only sometimes. To show this, we build a model of competition among privately issued fiat currencies. We modify the current workhorse of monetary economics, the Lagos-Wright environment, by including entrepreneurs who can issue their own fiat currencies in order to maximize their utility. Otherwise, the model is standard. We show that there exists an equilibrium in which price stability is consistent with competing private monies, but also that there exists a continuum of equilibrium trajectories with the property that the value of private currencies monotonically converges to zero. These latter equilibria disappear, however, when we introduce productive capital. We also investigate the properties of hybrid monetary arrangements with private and government monies, of automata issuing money, and the role of network effects.


16007 
Yunan Li 
Efficient Mechanisms with Information Acquisition  
This paper studies the design of ex ante efficient mechanisms in situations where a single item is for sale, and agents have positively interdependent values and can covertly acquire information at a cost before participating in a mechanism. I find that when interdependency is low and/or the number of agents is large, the ex post efficient mechanism is also ex ante efficient. In cases of high interdependency and/or a small number of agents, ex ante efficient mechanisms discourage agents from acquiring excessive information by introducing randomization to the ex post efficient allocation rule in areas where the information’s precision increases most rapidly.


16006 
Hanming Fang, Qing Gong 
Detecting Potential Overbilling in Medicare Reimbursement via Hours Worked  
Medicare overbilling refers to the phenomenon that providers report more and/or higher-intensity service codes than actually delivered in order to receive higher Medicare reimbursement. We propose a novel and easy-to-implement approach to detect potential overbilling based on the hours worked implied by the service codes physicians submit to Medicare. Using the Medicare Part B Fee-for-Service (FFS) Physician Utilization and Payment Data for 2012 and 2013 released by the Centers for Medicare and Medicaid Services (CMS), we first construct estimates of physicians' hours spent on Medicare Part B FFS beneficiaries. Despite our deliberately conservative estimation procedure, we find that about 2,300 physicians, or 3% of those with a significant fraction of Medicare Part B FFS services, have billed Medicare for over 100 hours per week. We consider such hours implausibly long. As a benchmark, the maximum weekly hours spent on Medicare patients by physicians in the National Ambulatory Medical Care Survey data is 50. Interestingly, we also find suggestive evidence that the coding patterns of the flagged physicians are responsive to financial incentives: within code clusters with different levels of service intensity, they tend to submit more high-intensity service codes than unflagged physicians; moreover, they are more likely to do so when the marginal revenue gain from submitting mid- or high-intensity codes is relatively high.


16005 
David Dillenberger, Collin Raymond 
Group-Shift and the Consensus Effect 
It is well documented that individuals make different choices in the context of group decisions, such as elections, than they make in isolation. In particular, individuals tend to conform to the decisions of others, a property we call the consensus effect, which in turn implies phenomena such as group polarization and the bandwagon effect. We show that the consensus effect is equivalent to a well-known violation of expected utility, namely strict quasiconvexity of preferences. Our results qualify and extend those of Eliaz, Ray and Razin (2006), who focus on choice shifts in groups when one option is safe (i.e., a degenerate lottery). In contrast to the equilibrium outcome when individuals are expected utility maximizers, the consensus effect implies that group decisions may fail to properly aggregate preferences in strategic contexts, and strictly Pareto-dominated equilibria may arise. Moreover, these problems become more severe as the size of the group grows.


16004 
Itzhak Gilboa, Andrew Postlewaite, Larry Samuelson, David Schmeidler 
Economics: Between Prediction and Criticism, Second Version  
We suggest that one way in which economic analysis is useful is by offering a critique of reasoning. According to this view, economic theory may be useful not only by providing predictions, but also by pointing out weaknesses of arguments. It is argued that, when a theory requires a nontrivial act of interpretation, its roles in producing predictions and offering critiques vary in a substantial way. We offer a formal model in which these different roles can be captured.


16003 
Itzhak Gilboa, Andrew Postlewaite, Larry Samuelson 
Memorable Consumption  
People often consume nondurable goods in a way that seems inconsistent with preferences for smoothing consumption over time. We suggest that such patterns of consumption can be better explained if one takes into account the future utility flows generated by memorable consumption goods, such as a honeymoon or a vacation, whose utility flow outlives their physical consumption. We consider a model in which a consumer enjoys current consumption as well as utility generated by earlier memorable consumption. Lasting utility flows are generated only by some goods, and only when their consumption exceeds customary levels by a sufficient margin. We offer axiomatic foundations for the structure of the utility function and study optimal consumption in a dynamic model. We show that rational consumers, taking into account future utility flows, would make optimal choices that rationalize lumpy patterns of consumption.


16002 
Behrang Kamali-Shahdadi 
Sorting and Peer Effects  
The effect of sorting students based on their academic performance depends not only on direct peer effects but also on indirect peer effects through teachers' efforts. Standard assumptions in the literature are insufficient to determine the effect of sorting on student performance, and so are silent on the effect of policies such as tracking, school choice, and voucher programs. We show that the effect of such policies depends on the curvature of teachers' marginal utility of effort. We characterize conditions under which sorting increases (decreases) the total effort of teachers and the average performance of students.


16001 
Guido Menzio, Leena Rudanko, Nicholas Trachter 
Relative Price Dispersion: Evidence and Theory  
We use a large dataset on retail pricing to document that a sizeable portion of the cross-sectional variation in the price at which the same good trades in the same period and in the same market is due to the fact that stores that are, on average, equally expensive set persistently different prices for the same good. We refer to this phenomenon as relative price dispersion. We argue that relative price dispersion stems from sellers’ attempts to discriminate between high-valuation buyers, who need to make all of their purchases in the same store, and low-valuation buyers, who are willing to purchase different items from different stores. We calibrate our theory and show that it is consistent not only with the extent and sources of dispersion in the price that different sellers charge for the same good, but also with the extent and sources of dispersion in the prices that different households pay for the same basket of goods, as well as with the relationship between prices paid and the number of stores visited by different households.


15043 
Enrique G. Mendoza, Javier Bianchi, Chenxin Liu 
Fundamentals News, Global Liquidity and Macroprudential Policy 
We study optimal macroprudential policy in a model in which unconventional shocks, in the form of news about future fundamentals and regime changes in world interest rates, interact with collateral constraints in driving the dynamics of financial crises. These shocks strengthen incentives to borrow in good times (i.e., when "good news" about future fundamentals coincides with a low-world-interest-rate regime), thereby increasing vulnerability to crises and enlarging the pecuniary externality due to the collateral constraints. Quantitatively, an optimal schedule of macroprudential debt taxes can lower the frequency and magnitude of financial crises, but the policy is complex because it features significant variation across interest-rate regimes and news realizations.


15042 
Jesús Fernández-Villaverde, Juan F. Rubio-Ramírez, Frank Schorfheide 
Solution and Estimation Methods for DSGE Models  
This chapter provides an overview of solution and estimation techniques for dynamic stochastic general equilibrium (DSGE) models. We cover the foundations of numerical approximation techniques as well as statistical inference and survey the latest developments in the field.


15041 
Hanming Fang, You Suk Kim, Wenli Li 
The Dynamics of Adjustable-Rate Subprime Mortgage Default: A Structural Estimation 
We present a dynamic structural model of subprime adjustable-rate mortgage (ARM) borrowers making payment decisions while taking into account the possible consequences of different degrees of delinquency imposed by their lenders. We empirically implement the model using unique data sets that contain information on borrowers' mortgage payment history, their broad balance sheets, and lender responses. Our investigation of the factors that drive borrowers' decisions reveals that subprime ARMs are not all alike. For loans originated in 2004 and 2005, the interest rate resets associated with ARMs, as well as the housing and labor market conditions, were not as important in borrowers' delinquency decisions as in their decisions to pay off their loans. For loans originated in 2006, interest rate resets, housing price declines, and worsening labor market conditions all contributed importantly to their high delinquency rates. Counterfactual policy simulations reveal that even if the Libor rate could be lowered to zero by aggressive traditional monetary policies, it would have only a limited effect on delinquency rates. We find that automatic modification mortgage designs, under which the monthly payment or the principal balance of the loan is automatically reduced when housing prices decline, can be effective in reducing both delinquency and foreclosure. Importantly, we find that automatic modification mortgages with a cushion, under which the monthly payment or principal balance reduction is triggered only when housing price declines exceed a certain percentage, may result in a Pareto improvement: borrowers and lenders are both made better off than under the baseline, with lower delinquency and foreclosure rates. Our counterfactual analysis also suggests that limited commitment power on the part of lenders to loan modification policies may be an important reason for the relatively low rate of modifications observed during the housing crisis.


15040 
Francis J. DiTraglia, Camilo García-Jimeno 
On Mismeasured Binary Regressors: New Results And Some Comments on the Literature, Third Version  
This paper studies the use of a discrete instrumental variable to identify the causal effect of an endogenous, mismeasured, binary treatment. We begin by showing that the only existing identification result for this case, which appears in Mahajan (2006), is incorrect. As such, identification in this model remains an open question. We first prove that the treatment effect is unidentified based on conditional first-moment information, regardless of the number of values that the instrument may take. We go on to derive a novel partial identification result based on conditional second moments that can be used to test for the presence of misclassification and to construct simple and informative bounds for the treatment effect. In certain special cases, we can in fact obtain point identification of the treatment effect based on second-moment information alone. When this is not possible, we show that adding conditional third-moment information point identifies the treatment effect and the measurement error process.


15039 
Francis J. DiTraglia, Camilo García-Jimeno 
On Mismeasured Binary Regressors: New Results And Some Comments on the Literature, Second Version  
This paper studies the use of a discrete instrumental variable to identify the causal effect of an endogenous, mismeasured, binary treatment. We begin by showing that the only existing identification result for this case, which appears in Mahajan (2006), is incorrect. As such, identification in this model remains an open question. We then prove that the treatment effect is unidentified based on conditional first-moment information, regardless of the number of values that the instrument may take. We go on to derive a novel partial identification result based on conditional second moments that can be used to test for the presence of misclassification and to construct simple and informative bounds for the treatment effect. In certain special cases, we can in fact obtain point identification of the treatment effect based on second-moment information alone. When this is not possible, we show that adding conditional third-moment information point identifies the treatment effect and the measurement error process.
Keywords: Instrumental variables, Measurement error, Endogeneity, Binary regressor, Partial Identification
JEL Codes: C10, C18, C25, C26


15038 
Mikhail Golosov, Guido Menzio 
Agency Business Cycles  
We propose a new business cycle theory. Firms need to randomize over firing or keeping workers who have performed poorly in the past in order to give them an ex ante incentive to exert effort. Firms have an incentive to coordinate the outcomes of their randomizations, as coordination allows them to load the firing probability on states of the world in which it is costlier for workers to become unemployed and, hence, allows them to reduce overall agency costs. In the unique robust equilibrium, firms use a sunspot to coordinate the randomization outcomes, and the economy experiences endogenous, stochastic aggregate fluctuations.
JEL Codes: D86, E24, E32.
Keywords: Unemployment, Moral Hazard, Endogenous Business Cycles.


15037 
Francis J. DiTraglia, Camilo García-Jimeno 
On Mismeasured Binary Regressors: New Results And Some Comments on the Literature  
This Version: November 2, 2015, First Version: October 31, 2015
This paper studies the use of a discrete instrumental variable to identify the causal effect of an endogenous, mismeasured, binary treatment in a homogeneous effects model with additively separable errors. We begin by showing that the only existing identification result for this case, which appears in Mahajan (2006), is incorrect. As such, identification in this model remains an open question. We provide a convenient notational framework to address this question and use it to derive a number of results. First, we prove that the treatment effect is unidentified based on conditional first-moment information, regardless of the number of values that the instrument may take. Second, we derive a novel partial identification result based on conditional second moments that can be used to test for the presence of misclassification and to construct bounds for the treatment effect. In certain special cases, we can in fact obtain point identification of the treatment effect based on second-moment information alone. When this is not possible, we show that adding conditional third-moment information point identifies the treatment effect and completely characterizes the measurement error process.
Keywords: Instrumental variables, Measurement error, Endogeneity, Binary regressor, Partial Identification
JEL Codes: C10, C18, C25, C26


15036 
Hanming Fang, Rongzhu Ke, Li-An Zhou 
Rosca Meets Formal Credit Market  
A Rotating Savings and Credit Association (Rosca) is an important informal financial institution in many parts of the world, used by participants to share income risks. What is the role of a Rosca when a formal credit market is introduced? We develop a model in which risk-averse participants attempt to hedge against their private income shocks with access to both a Rosca and formal credit, and we investigate their interactions. Using the gap between the borrowing and saving interest rates as a measure of the imperfection of the credit market, we compare three cases: (i) a Rosca without a credit market; (ii) a Rosca with a perfect credit market; (iii) a Rosca with an imperfect credit market. We show that a perfect credit market completely crowds out the role of the Rosca. However, when the credit market is present but imperfect, the Rosca and the formal credit market can complement each other in improving social welfare. Interestingly, we find that social welfare in an environment with both a Rosca and a formal credit market does not necessarily increase monotonically as the imperfection of the credit market converges to zero.


15035 
Jesús Fernández-Villaverde 
Magna Carta, the Rule of Law, and the Limits on Government  
This paper surveys the legal tradition that links Magna Carta with the modern concepts of the rule of law and the limits on government. It documents that the original understanding of the rule of law included substantive commitments to individual freedom and limited government. It then attempts to explain how and why such commitments were lost to a formalist interpretation of the rule of law from 1848 to 1939. The paper concludes by arguing that a revival of the substantive commitments of the rule of law is central to a project of reshaping modern states.


15034 
George J. Mailath, Andrew Postlewaite, Larry Samuelson 
Premuneration Values and Investments in Matching Markets  
We analyze a model in which agents make investments and then match into pairs to create a surplus. The agents can make transfers to reallocate their pre-transfer ownership claims on the surplus. Mailath, Postlewaite, and Samuelson (2013) showed that when investments are unobservable, equilibrium investments are generally inefficient. In this paper we work with a more structured model that is sufficiently tractable to analyze the nature of the investment inefficiencies. We provide conditions under which investment is inefficiently high or low and conditions under which changes in the pre-transfer ownership claims on the surplus will be Pareto improving, and we examine how the degree of heterogeneity on either side of the market affects investment efficiency.


15033 
Pablo D'Erasmo, Enrique G. Mendoza, Jing Zhang 
What is a Sustainable Public Debt? 
The question of what is a sustainable public debt is paramount in the macroeconomic analysis of fiscal policy. This question is usually formulated as asking whether the outstanding public debt and its projected path are consistent with those of the government's revenues and expenditures (i.e., whether fiscal solvency conditions hold). We identify critical flaws in the traditional approach to evaluating debt sustainability and examine three alternative approaches that provide useful econometric and model-simulation tools for analyzing debt sustainability. The first approach is Bohn's nonstructural empirical framework, based on a fiscal reaction function that characterizes the dynamics of sustainable debt and primary balances. The second is a structural approach based on a calibrated dynamic general equilibrium framework with a fully specified fiscal sector, which we use to quantify the positive and normative effects of fiscal policies aimed at restoring fiscal solvency in response to changes in debt. The third approach deviates from the others in assuming that governments cannot commit to repaying their domestic debt and can thus optimally decide to default even if the debt is sustainable in terms of fiscal solvency. We use these three approaches to analyze debt sustainability in the United States and Europe after the recent surge in public debt following the 2008 crisis, and find that all three raise serious questions about the prospects of fiscal adjustment and its consequences in advanced economies.
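For reference, the fiscal reaction function at the heart of Bohn's framework can be sketched as follows (the notation is ours, not the paper's): the primary balance responds to lagged debt, both measured as shares of GDP.

```latex
% Bohn-style fiscal reaction function (illustrative notation):
%   pb_t : primary balance / GDP,  d_{t-1} : public debt / GDP,
%   X_t  : cyclical and spending controls
pb_t = \mu + \rho\, d_{t-1} + \beta' X_t + \varepsilon_t
% A positive estimated response \rho > 0 means rising debt is met by
% corrective primary surpluses, which, under standard transversality
% conditions, is consistent with fiscal solvency.
```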


15032 
Javier Bianchi, Enrique G. Mendoza 
Optimal Time-Consistent Macroprudential Policy 
Collateral constraints widely used in models of financial crises feature a pecuniary externality: agents do not internalize how borrowing decisions taken in "good times" affect collateral prices during a crisis. We show that agents in a competitive equilibrium borrow more than a financial regulator who internalizes this externality. We also find, however, that under commitment the regulator's plans are time-inconsistent, and hence we focus on studying optimal, time-consistent policy without commitment. This policy features a state-contingent macroprudential debt tax that is strictly positive at date t if a crisis has positive probability at date t + 1. Quantitatively, this policy sharply reduces the frequency and magnitude of crises, removes fat tails from the distribution of returns, and increases social welfare. In contrast, constant debt taxes are ineffective and can be welfare-reducing, while an optimized "macroprudential Taylor rule" is effective, but less so than the optimal policy.


15031 
Pablo D'Erasmo, Enrique G. Mendoza 
Distributional Incentives in an Equilibrium Model of Domestic Sovereign Default 
Europe’s debt crisis resembles historical episodes of outright default on domestic public debt, about which little research exists. This paper proposes a theory of domestic sovereign default based on distributional incentives affecting the welfare of risk-averse debt holders and non-debt holders. A utilitarian government cannot sustain debt if default is costless. If default is costly, debt with default risk is sustainable, and debt falls as the concentration of debt ownership rises. A government favoring bondholders can also sustain debt, with debt rising as ownership becomes more concentrated. These results are robust to adding foreign investors, redistributive taxes, or a second asset.


15030 
Francis J. DiTraglia, Camilo García-Jimeno 
A Framework for Eliciting, Incorporating, and Disciplining Identification Beliefs in Linear Models, Third Version 
The identification of causal effects in linear models relies, explicitly and implicitly, on the imposition of researcher beliefs along several dimensions. Assumptions about measurement error, regressor endogeneity, and instrument validity are three key components of any such empirical exercise. Although in practice researchers reason about these three dimensions independently, we show that measurement error, regressor endogeneity, and instrument invalidity are mutually constrained by each other and by the data in a manner that becomes apparent only once one writes down the full identified set for the model. The nature of this set makes it clear that researcher beliefs over these objects cannot, and indeed should not, be independent: there are fewer degrees of freedom than parameters. By failing to take this into account, applied researchers both leave money on the table, by failing to incorporate relevant information in estimation, and, more importantly, risk reasoning to a contradiction by expressing mutually incompatible beliefs. We propose a Bayesian framework to help researchers elicit their beliefs, explicitly incorporate them into estimation, and ensure that they are mutually coherent. We illustrate the practical usefulness of our method by applying it to several well-known papers from the empirical microeconomics literature.


15029 
Behrang Kamali-Shahdadi 
Matching with Moral Hazard: Assigning Attorneys to Indigent Defendants 
Each year, over a hundred thousand defendants who are too poor to pay for a lawyer are assigned counsel. Existing procedures for making such assignments are essentially random and have been criticized for giving indigent defendants no say in choosing the counsel they are assigned. In this paper, we model the problem of assigning counsel to indigent defendants as a matching problem. A novel aspect of this matching problem is the moral hazard component on the part of counsel. Within the model, we show that holding the total expenditure on counsel fixed and changing the matching procedure to accommodate defendants' and attorneys' preferences will make defendants worse off. More precisely, if we switch from random matching to stable matching, defendants become worse off because stable matching exacerbates the moral hazard problem on the part of counsel. In addition, we find conditions on the reservation wages of attorneys under which random matching is the efficient way to allocate defendants to counsel.


15028 
Francis J. DiTraglia, Camilo García-Jimeno 
A Framework for Eliciting, Incorporating, and Disciplining Identification Beliefs in Linear Models, Second Version 
The identification of causal effects in linear models relies, explicitly and implicitly, on the imposition of researcher beliefs along several dimensions. Assumptions about measurement error, regressor endogeneity, and instrument validity are three key components of any such empirical exercise. Although in practice researchers reason about these three dimensions independently, we show that measurement error, regressor endogeneity, and instrument invalidity are mutually constrained by each other and by the data in a manner that becomes apparent only once one writes down the full identified set for the model. The nature of this set makes it clear that researcher beliefs over these objects cannot, and indeed should not, be independent: there are fewer degrees of freedom than parameters. By failing to take this into account, applied researchers both leave money on the table, by failing to incorporate relevant information in estimation, and, more importantly, risk reasoning to a contradiction by expressing mutually incompatible beliefs. We propose a Bayesian framework to help researchers elicit their beliefs, explicitly incorporate them into estimation, and ensure that they are mutually coherent. We illustrate the practical usefulness of our method by applying it to several well-known papers from the empirical microeconomics literature.


15027 
Francis J. DiTraglia 
Using Invalid Instruments on Purpose: Focused Moment Selection and Averaging for GMM, Second Version 
In finite samples, the use of a slightly endogenous but highly relevant instrument can reduce mean-squared error (MSE). Building on this observation, I propose a moment selection criterion for GMM in which moment conditions are chosen based on the MSE of their associated estimators rather than their validity: the focused moment selection criterion (FMSC). I then show how the framework used to derive the FMSC can address the problem of inference post-moment selection. Treating post-selection estimators as a special case of moment averaging, in which estimators based on different moment sets are given data-dependent weights, I propose a simulation-based procedure to construct valid confidence intervals for a variety of formal and informal moment-selection and averaging procedures. Both the FMSC and the confidence interval procedure perform well in simulations. I conclude with an empirical example examining the effect of instrument selection on the estimated relationship between malaria transmission and income.
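The finite-sample trade-off motivating this criterion can be sketched in one line (our notation; the paper models invalidity as local misspecification shrinking with the sample size): a slightly invalid but relevant moment adds bias of small order while reducing variance, so the mean-squared error comparison can favor including it.

```latex
% Bias-variance decomposition for an estimator that adds one
% slightly invalid moment condition (illustrative notation):
\mathrm{MSE}(\hat\beta) = \mathrm{Bias}(\hat\beta)^2 + \mathrm{Var}(\hat\beta),
\qquad
\mathrm{Bias}(\hat\beta) = O\!\bigl(\tau/\sqrt{n}\bigr),
% where \tau indexes the degree of invalidity. When \tau is small
% relative to the variance reduction from the added moment, total
% MSE falls despite the bias.
```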


15026A 
Patrick DeJarnette, David Dillenberger, Daniel Gottlieb, Pietro Ortoleva 
Time Lotteries: Online Appendix 
This online appendix provides additional proofs, extensions, and the full experimental instructions and questionnaire.


15026 
Patrick DeJarnette, David Dillenberger, Daniel Gottlieb, Pietro Ortoleva 
Time Lotteries 
We study preferences over lotteries that pay a specific prize at uncertain dates. Expected utility with convex discounting implies that individuals prefer receiving $x on a random date with mean t to receiving $x in t days for sure. Our experiment rejects this prediction. It suggests a link between preferences for payments at certain dates and standard risk aversion. Epstein-Zin (1989) preferences accommodate such behavior and fit the data better than a model with probability weighting. We thus provide another justification for disentangling attitudes toward risk and time, as in Epstein-Zin, and suggest new theoretical restrictions on its key parameters.
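The prediction tested here is an instance of Jensen's inequality; a one-line sketch in our own notation (not the paper's):

```latex
% For a convex discount function D and a random payment date \tilde t
% with mean t, Jensen's inequality gives
\mathbb{E}\bigl[x\,D(\tilde t)\bigr]
  = x\,\mathbb{E}\bigl[D(\tilde t)\bigr]
  \;\ge\; x\,D\bigl(\mathbb{E}[\tilde t]\bigr)
  = x\,D(t),
% so an expected-utility maximizer with convex discounting weakly
% prefers the time lottery to receiving the prize on the sure date t.
```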


15025 
Mert Demirer, Francis X. Diebold, Laura Liu, Kamil Yilmaz 
Estimating Global Bank Network Connectedness 
We use lasso methods to shrink, select, and estimate the network linking the publicly traded subset of the world’s top 150 banks, 2003-2014. We characterize static network connectedness using full-sample estimation and dynamic network connectedness using rolling-window estimation. Statically, we find that global banking connectedness is clearly linked to bank location, not bank assets. Dynamically, we find that global banking connectedness displays both secular and cyclical variation. The secular variation corresponds to gradual increases/decreases during episodes of gradual increases/decreases in global market integration. The cyclical variation corresponds to sharp increases during crises, involving mostly cross-country, as opposed to within-country, bank linkages.
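The lasso-based selection idea can be illustrated with a minimal sketch (all names, data, and tuning choices are ours, not the authors' code or specification): each series is regressed on all the others with an L1 penalty, and the coefficients that survive the penalty define the estimated links.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent with soft-thresholding.
    Minimizes 0.5*||y - X b||^2 + lam*||b||_1 (illustrative, unoptimized)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding feature j
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

def estimate_network(series, lam):
    """Directed adjacency matrix: edge j -> i if series j helps explain
    series i under the lasso penalty (a toy stand-in for the paper's
    high-dimensional estimation)."""
    T, N = series.shape
    A = np.zeros((N, N), dtype=bool)
    for i in range(N):
        others = [j for j in range(N) if j != i]
        beta = lasso_cd(series[:, others], series[:, i], lam)
        for b, j in zip(beta, others):
            A[j, i] = b != 0.0
    return A

rng = np.random.default_rng(0)
x = rng.standard_normal((400, 4))
x[:, 1] = 0.8 * x[:, 0] + 0.1 * rng.standard_normal(400)  # series 0 drives series 1
A = estimate_network(x, lam=40.0)
print(A[0, 1])  # the genuine link should survive the penalty
```

The penalty level plays the role of the shrinkage/selection tuning in the paper's procedure: larger values zero out weak, likely spurious links and leave a sparse network.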


15024 
Naoki Aizawa Hanming Fang 
"Equilibrium Labor Market Search and Health Insurance Reform, Second Version"  
We present and empirically implement an equilibrium labor market search model in which risk-averse workers facing medical expenditure shocks are matched with firms making health insurance coverage decisions. Our model delivers a rich set of predictions that can account for a wide variety of phenomena observed in the data, including the correlations among firm sizes, wages, health insurance offering rates, turnover rates, and workers' health compositions. We estimate our model by Generalized Method of Moments using a combination of micro datasets, including the Survey of Income and Program Participation, the Medical Expenditure Panel Survey, and the Robert Wood Johnson Foundation Employer Health Insurance Survey. We use our estimated model to evaluate the equilibrium impact of the 2010 Affordable Care Act (ACA) and find that it would reduce the uninsured rate among the workers in our estimation sample from about 22% in the pre-ACA benchmark economy to less than 4%. We also find that income-based premium subsidies for health insurance purchases from the exchange play an important role in the sustainability of the ACA; without the premium subsidies, the uninsured rate would be around 18%. In contrast, as long as premium subsidies and health insurance exchanges with community ratings stay intact, the ACA without the individual mandate, without the employer mandate, or without both mandates could still succeed in reducing the uninsured rate to 7.34%, 4.63%, and 9.22%, respectively. Download Paper


15023 
Richard P. McLean Andrew Postlewaite 
"A Dynamic Nondirect Implementation Mechanism for Interdependent Value Problems, Second Version"  
Much of the literature on mechanism design and implementation uses the revelation principle to restrict attention to direct mechanisms. This is without loss of generality in a well-defined sense. It is, however, restrictive if one is concerned with the set of equilibria, the size of the messages that will be sent, or privacy. We showed in McLean and Postlewaite (2014) that when agents are informationally small, there exist small modifications to VCG mechanisms in interdependent value problems that restore incentive compatibility. We show here how one can construct a two-stage mechanism that similarly restores incentive compatibility while improving upon the direct one-stage mechanism in terms of privacy and the size of the messages that must be sent. The first stage essentially elicits the part of the agents' private information that induces interdependence and reveals it to all agents, transforming the interdependent value problem into a private value problem. The second stage is a VCG mechanism for the now private value problem. Agents typically need to transmit substantially less information in the two-stage mechanism than would be necessary for a single-stage mechanism. Lastly, the first stage, which elicits the part of the agents' private information that induces interdependence, can be used to transform certain other interdependent value problems into private value problems. Download Paper


15022 
Aislinn Bohren 
"Informational Herding with Model Misspecification, Second Version"  
This paper demonstrates that a misspecified model of information processing interferes with long-run learning and allows inefficient choices to persist in the face of contradictory public information. I consider an observational learning environment where agents observe a private signal about a hidden state, and some agents observe the actions of their predecessors. Prior actions aggregate multiple sources of correlated information about the state, and agents face an inferential challenge to distinguish between new and redundant information. When individuals significantly overestimate the amount of new information, beliefs about the state become entrenched and incorrect learning may occur. When individuals sufficiently overestimate the amount of redundant information, beliefs are fragile and learning is incomplete. Learning is complete when agents have an approximately correct model of inference, establishing that the correct model is robust to perturbation. These results have important implications for the timing, frequency, and strength of policy interventions designed to facilitate learning. Download Paper


15021 
Sarah Baird Aislinn Bohren Craig McIntosh Berk Ozler 
"Designing Experiments to Measure Spillover Effects, Second Version"  
This paper formalizes the design of experiments intended specifically to study spillover effects. By first randomizing the intensity of treatment within clusters and then randomly assigning individual treatment conditional on this cluster-level intensity, a novel set of treatment effects can be identified. We develop a formal framework for consistent estimation of these effects, and provide explicit expressions for power calculations. We show that the power to detect average treatment effects declines precisely with the quantity that identifies the novel treatment effects. A demonstration of the technique is provided using a cash transfer program in Malawi. Download Paper
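The two-step randomization the abstract describes can be sketched directly: draw a treatment saturation for each cluster first, then assign individuals within the cluster at that rate. This is a minimal illustration of the design idea only; the cluster count, cluster size, and saturation grid are invented for the example and are not the Malawi program's design:

```python
import random

random.seed(42)

def two_stage_assignment(n_clusters, cluster_size, saturations):
    """Stage 1: randomize each cluster's treatment saturation (intensity).
    Stage 2: randomize individual treatment within the cluster at that rate.
    Untreated units in partially treated clusters are what identify spillovers."""
    design = []
    for c in range(n_clusters):
        s = random.choice(saturations)             # cluster-level intensity
        n_treat = round(s * cluster_size)
        treated = set(random.sample(range(cluster_size), n_treat))
        design.append({"cluster": c,
                       "saturation": s,
                       "treated": [i in treated for i in range(cluster_size)]})
    return design

design = two_stage_assignment(n_clusters=10, cluster_size=20,
                              saturations=[0.0, 0.25, 0.5, 0.75])
```

Varying the saturation across clusters creates comparison groups of untreated individuals exposed to different treatment intensities, which is the variation the paper's estimators exploit.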
