Working Papers
By Year:
Paper #  Author  Title  

15032 
Javier Bianchi, Enrique G. Mendoza
"Optimal TimeConsistent Macroprudential Policy"  
Collateral constraints widely used in models of financial crises feature a pecuniary externality: agents do not internalize how borrowing decisions taken in “good times” affect collateral prices during a crisis. We show that agents in a competitive equilibrium borrow more than a financial regulator who internalizes this externality. We also find, however, that under commitment the regulator's plans are time-inconsistent, and hence focus on studying optimal, time-consistent policy without commitment. This policy features a state-contingent macroprudential debt tax that is strictly positive at date t if a crisis has positive probability at t + 1. Quantitatively, this policy sharply reduces the frequency and magnitude of crises, removes fat tails from the distribution of returns, and increases social welfare. In contrast, constant debt taxes are ineffective and can be welfare-reducing, while an optimized “macroprudential Taylor rule” is effective but less so than the optimal policy. Download Paper
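For concreteness, a minimal sketch with a generic collateral constraint of the kind common in this literature (hypothetical notation; the paper's exact specification may differ): debt $b_{t+1}$ is bounded by a fraction $\kappa$ of the market value of collateral, $-b_{t+1} \le \kappa q_t k_{t+1}$, where individual agents take the collateral price $q_t$ as given. Because extra borrowing in good times depresses $q_t$ in a crisis and tightens everyone's constraint, a state-contingent tax $\tau_t > 0$ on new debt raises the private cost of borrowing exactly at dates when the constraint may bind at $t + 1$, which is the externality-correcting role the abstract describes.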


15031 
Pablo D'Erasmo, Enrique G. Mendoza
"Distributional Incentives in an Equilibrium Model of Domestic Sovereign Default"  
Europe’s debt crisis resembles historical episodes of outright default on domestic public debt, about which little research exists. This paper proposes a theory of domestic sovereign default based on distributional incentives affecting the welfare of risk-averse debt holders and non-debt holders. A utilitarian government cannot sustain debt if default is costless. If default is costly, debt with default risk is sustainable, and debt falls as the concentration of debt ownership rises. A government favoring bond holders can also sustain debt, with debt rising as ownership becomes more concentrated. These results are robust to adding foreign investors, redistributive taxes, or a second asset. Download Paper


15030 
Francis J. DiTraglia, Camilo Garcia-Jimeno
"A Framework for Eliciting, Incorporating, and Disciplining Identification Beliefs in Linear Models", Third Version  
The identification of causal effects in linear models relies, explicitly and implicitly, on the imposition of researcher beliefs along several dimensions. Assumptions about measurement error, regressor endogeneity, and instrument validity are three key components of any such empirical exercise. Although in practice researchers reason about these three dimensions independently, we show that measurement error, regressor endogeneity, and instrument invalidity are mutually constrained by each other and the data in a manner that is only apparent by writing down the full identified set for the model. The nature of this set makes it clear that researcher beliefs over these objects cannot, and indeed should not, be independent: there are fewer degrees of freedom than parameters. By failing to take this into account, applied researchers both leave money on the table (by failing to incorporate relevant information in estimation) and, more importantly, risk reasoning to a contradiction by expressing mutually incompatible beliefs. We propose a Bayesian framework to help researchers elicit their beliefs, explicitly incorporate them into estimation, and ensure that they are mutually coherent. We illustrate the practical usefulness of our method by applying it to several well-known papers from the empirical microeconomics literature. Download Paper
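To illustrate the mutual constraint the abstract describes, here is a minimal sketch in hypothetical notation (assuming classical measurement error; the paper's setup is more general): let $y = \beta x^* + u$, let the observed regressor be $x = x^* + w$, and let $z$ be a candidate instrument. The observable moments then satisfy $\sigma_{zy} = \beta \sigma_{zx} + \sigma_{zu}$, $\sigma_{xy} = \beta \sigma_{x^*}^2 + \sigma_{x^* u}$, and $\sigma_x^2 = \sigma_{x^*}^2 + \sigma_w^2$, so beliefs about measurement error ($\sigma_w^2$), regressor endogeneity ($\sigma_{x^* u}$), and instrument invalidity ($\sigma_{zu}$) are jointly restricted by the data: fixing any two of them restricts what the third can be.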


15029 
Behrang Kamali-Shahdadi
"Matching with Moral Hazard: Assigning Attorneys to Indigent Defendants"  
Each year, over a hundred thousand defendants who are too poor to pay for a lawyer are assigned counsel. Existing procedures for making such assignments are essentially random and have been criticized for giving indigent defendants no say in choosing the counsel they are assigned to. In this paper, we model the problem of assigning counsel to indigent defendants as a matching problem. A novel aspect of this matching problem is the moral hazard component on the part of counsel. Within the model, we show that holding the total expenditure for counsel fixed and changing the matching procedure to accommodate defendants' and attorneys' preferences will make defendants worse off. More precisely, if we switch from random matching to stable matching, defendants become worse off because stable matching exacerbates the moral hazard problem on the part of counsel. In addition, we find conditions on reservation wages of attorneys under which random matching is the efficient way to allocate defendants to counsel. Download Paper


15028 
Francis J. DiTraglia, Camilo Garcia-Jimeno
"A Framework for Eliciting, Incorporating, and Disciplining Identification Beliefs in Linear Models," Second Version  
The identification of causal effects in linear models relies, explicitly and implicitly, on the imposition of researcher beliefs along several dimensions. Assumptions about measurement error, regressor endogeneity, and instrument validity are three key components of any such empirical exercise. Although in practice researchers reason about these three dimensions independently, we show that measurement error, regressor endogeneity, and instrument invalidity are mutually constrained by each other and the data in a manner that is only apparent by writing down the full identified set for the model. The nature of this set makes it clear that researcher beliefs over these objects cannot, and indeed should not, be independent: there are fewer degrees of freedom than parameters. By failing to take this into account, applied researchers both leave money on the table (by failing to incorporate relevant information in estimation) and, more importantly, risk reasoning to a contradiction by expressing mutually incompatible beliefs. We propose a Bayesian framework to help researchers elicit their beliefs, explicitly incorporate them into estimation, and ensure that they are mutually coherent. We illustrate the practical usefulness of our method by applying it to several well-known papers from the empirical microeconomics literature. Download Paper


15027 
Francis J. DiTraglia 
"Using Invalid Instruments on Purpose: Focused Moment Selection and Averaging for GMM", Second Version  
In finite samples, the use of a slightly endogenous but highly relevant instrument can reduce mean-squared error (MSE). Building on this observation, I propose a moment selection criterion for GMM in which moment conditions are chosen based on the MSE of their associated estimators rather than their validity: the focused moment selection criterion (FMSC). I then show how the framework used to derive the FMSC can address the problem of inference post-moment selection. Treating post-selection estimators as a special case of moment averaging, in which estimators based on different moment sets are given data-dependent weights, I propose a simulation-based procedure to construct valid confidence intervals for a variety of formal and informal moment-selection and averaging procedures. Both the FMSC and confidence interval procedure perform well in simulations. I conclude with an empirical example examining the effect of instrument selection on the estimated relationship between malaria transmission and income. Download Paper
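A minimal sketch of the trade-off the FMSC formalizes, in hypothetical notation rather than the paper's exact expressions: for an estimator $\hat{\theta}_S$ based on moment set $S$, the criterion targets the asymptotic mean-squared error $\mathrm{AMSE}(\hat{\theta}_S) = \mathrm{Bias}(\hat{\theta}_S)^2 + \mathrm{Var}(\hat{\theta}_S)$, so a slightly invalid but highly relevant moment condition is included whenever its squared-bias contribution is outweighed by the variance reduction it delivers.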


15026A 
Patrick DeJarnette, David Dillenberger, Daniel Gottlieb, Pietro Ortoleva
"Time Lotteries: Online Appendix"  
This online appendix provides additional proofs and extensions, as well as all experimental instructions and the questionnaire. Download Paper


15026 
Patrick DeJarnette, David Dillenberger, Daniel Gottlieb, Pietro Ortoleva
"Time Lotteries"  
We study preferences over lotteries that pay a specific prize at uncertain dates. Expected Utility with convex discounting implies that individuals prefer receiving $x at a random date with mean t to receiving $x in t days for sure. Our experiment rejects this prediction. It suggests a link between preferences for payments at certain dates and standard risk aversion. Epstein-Zin (1989) preferences accommodate such behavior and fit the data better than a model with probability weighting. We thus provide another justification for disentangling attitudes toward risk and time, as in Epstein-Zin, and suggest new theoretical restrictions on its key parameters. Download Paper


15025 
Mert Demirer, Francis X. Diebold, Laura Liu, Kamil Yilmaz
“Estimating Global Bank Network Connectedness”  
We use lasso methods to shrink, select, and estimate the network linking the publicly traded subset of the world’s top 150 banks, 2003-2014. We characterize static network connectedness using full-sample estimation and dynamic network connectedness using rolling-window estimation. Statistically, we find that global banking connectedness is clearly linked to bank location, not bank assets. Dynamically, we find that global banking connectedness displays both secular and cyclical variation. The secular variation corresponds to gradual increases/decreases during episodes of gradual increases/decreases in global market integration. The cyclical variation corresponds to sharp increases during crises, involving mostly cross-country, as opposed to within-country, bank linkages. Download Paper
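As a rough illustration of the shrink/select/estimate step, here is a sketch in Python. This is not the authors' code: the paper builds connectedness from variance decompositions in the Diebold-Yilmaz tradition, whereas this sketch only conveys the lasso step with a crude coefficient-share summary.

```python
# Fit a sparse VAR(1) on simulated bank volatility series with the lasso and
# summarize "connectedness" as the weight each bank's equation places on
# other banks. Illustrative only; the paper's procedure differs in detail.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_banks, T = 10, 500

# simulate a sparse VAR(1): each bank loads on itself and one neighbor
A = np.diag(np.full(n_banks, 0.4))
for i in range(n_banks):
    A[i, (i + 1) % n_banks] = 0.3
X = np.zeros((T, n_banks))
for t in range(1, T):
    X[t] = X[t - 1] @ A.T + rng.normal(scale=0.5, size=n_banks)

# lasso regression of each bank's series on the lagged series of all banks
B = np.zeros((n_banks, n_banks))
for i in range(n_banks):
    B[i] = Lasso(alpha=0.05, fit_intercept=False).fit(X[:-1], X[1:, i]).coef_

# crude connectedness summary: share of each row's weight on other banks
abs_B = np.abs(B)
from_others = (abs_B.sum(axis=1) - np.diag(abs_B)) / abs_B.sum(axis=1)
print(np.round(from_others, 2))
```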


15024 
Naoki Aizawa, Hanming Fang
"Equilibrium Labor Market Search and Health Insurance Reform, Second Version"  
We present and empirically implement an equilibrium labor market search model where risk averse workers facing medical expenditure shocks are matched with firms making health insurance coverage decisions. Our model delivers a rich set of predictions that can account for a wide variety of phenomena observed in the data, including the correlations among firm sizes, wages, health insurance offering rates, turnover rates, and workers' health compositions. We estimate our model by the Generalized Method of Moments using a combination of micro datasets, including the Survey of Income and Program Participation, the Medical Expenditure Panel Survey, and the Robert Wood Johnson Foundation Employer Health Insurance Survey. We use our estimated model to evaluate the equilibrium impact of the 2010 Affordable Care Act (ACA) and find that it would reduce the uninsured rate among the workers in our estimation sample from about 22% in the pre-ACA benchmark economy to less than 4%. We also find that income-based premium subsidies for health insurance purchases from the exchange play an important role for the sustainability of the ACA; without the premium subsidies, the uninsured rate would be around 18%. In contrast, as long as premium subsidies and health insurance exchanges with community ratings stay intact, the ACA without the individual mandate, without the employer mandate, or without both mandates could still reduce the uninsured rate to 7.34%, 4.63%, and 9.22%, respectively. Download Paper


15023 
Richard P. McLean, Andrew Postlewaite
"A Dynamic Nondirect Implementation Mechanism for Interdependent Value Problems", Second Version  
Much of the literature on mechanism design and implementation uses the revelation principle to restrict attention to direct mechanisms. This is without loss of generality in a well-defined sense. It is, however, restrictive if one is concerned with the set of equilibria, if one is concerned about the size of messages that will be sent, or if one is concerned about privacy. We showed in McLean and Postlewaite (2014) that when agents are informationally small, there exist small modifications to VCG mechanisms in interdependent value problems that restore incentive compatibility. We show here how one can construct a two-stage mechanism that similarly restores incentive compatibility while improving upon the direct one-stage mechanism in terms of privacy and the size of messages that must be sent. The first stage essentially elicits that part of the agents' private information that induces interdependence and reveals it to all agents, transforming the interdependent value problem into a private value problem. The second stage is a VCG mechanism for the now private value problem. Agents typically need to transmit substantially less information in the two-stage mechanism than would be necessary for a single-stage mechanism. Lastly, the first stage that elicits the part of the agents' private information that induces interdependence can be used to transform certain other interdependent value problems into private value problems. Download Paper
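As a reminder of what the second stage does once values have been made private, here is a toy sketch of a VCG rule in the simplest possible setting, a single indivisible item. This simple setting is our illustration, not the paper's general mechanism.

```python
# Toy VCG rule for a single item with private values: allocate to the highest
# reported value and charge the winner the externality imposed on others,
# which here is the second-highest report. Illustrative only.
from typing import Dict, Tuple

def vcg_single_item(reported_values: Dict[str, float]) -> Tuple[str, float]:
    """Return the winner and the VCG payment (second-highest report)."""
    ranked = sorted(reported_values, key=reported_values.get, reverse=True)
    winner = ranked[0]
    payment = reported_values[ranked[1]] if len(ranked) > 1 else 0.0
    return winner, payment

# truthful reporting is a dominant strategy in this private value stage
print(vcg_single_item({"a": 10.0, "b": 7.0, "c": 3.0}))  # -> ('a', 7.0)
```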


15022 
Aislinn Bohren 
"Informational Herding with Model Misspecification, Second Version"  
This paper demonstrates that a misspecified model of information processing interferes with long-run learning and allows inefficient choices to persist in the face of contradictory public information. I consider an observational learning environment where agents observe a private signal about a hidden state, and some agents observe the actions of their predecessors. Prior actions aggregate multiple sources of correlated information about the state, and agents face an inferential challenge to distinguish between new and redundant information. When individuals significantly overestimate the amount of new information, beliefs about the state become entrenched and incorrect learning may occur. When individuals sufficiently overestimate the amount of redundant information, beliefs are fragile and learning is incomplete. Learning is complete when agents have an approximately correct model of inference, establishing that the correct model is robust to perturbation. These results have important implications for the timing, frequency, and strength of policy interventions designed to facilitate learning. Download Paper


15021 
Sarah Baird, Aislinn Bohren, Craig McIntosh, Berk Ozler
"Designing Experiments to Measure Spillover Effects, Second Version"  
This paper formalizes the design of experiments intended specifically to study spillover effects. By first randomizing the intensity of treatment within clusters and then randomly assigning individual treatment conditional on this cluster-level intensity, a novel set of treatment effects can be identified. We develop a formal framework for consistent estimation of these effects, and provide explicit expressions for power calculations. We show that the power to detect average treatment effects declines precisely with the quantity that identifies the novel treatment effects. A demonstration of the technique is provided using a cash transfer program in Malawi. Download Paper
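A minimal sketch of the two-stage randomization described above (assumed cluster sizes and saturation values, not the authors' code):

```python
# Stage 1: assign each cluster a treatment intensity (saturation).
# Stage 2: randomize individual treatment within each cluster at that
# intensity. Untreated individuals in partially treated clusters, compared
# with individuals in pure-control clusters, identify spillover effects.
import numpy as np

rng = np.random.default_rng(1)
n_clusters, cluster_size = 20, 50
saturations = np.array([0.0, 0.25, 0.5, 0.75])  # candidate intensities

# stage 1: randomize each cluster's treatment intensity
cluster_sat = rng.choice(saturations, size=n_clusters)

# stage 2: randomize individual treatment conditional on cluster intensity
assignments = []
for sat in cluster_sat:
    treated = np.zeros(cluster_size, dtype=int)
    n_treated = int(round(sat * cluster_size))
    treated[rng.choice(cluster_size, size=n_treated, replace=False)] = 1
    assignments.append(treated)

# overall share treated across the experiment
print(round(float(np.mean([a.mean() for a in assignments])), 2))
```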


15020 
Ufuk Akcigit, William Kerr
"Growth through Heterogeneous Innovation, Second Version"  
We study how external versus internal innovations promote economic growth through a tractable endogenous growth framework with multiple innovation sizes, multiproduct firms, and entry/exit. Firms invest in external R&D to acquire new product lines and in internal R&D to improve their existing product lines. A baseline model derives the theoretical implications of weaker scaling for external R&D versus internal R&D, and the resulting predictions align with observed empirical regularities for innovative firms. Quantifying a generalized model for the recent U.S. economy using matched Census Bureau and patent data, we observe a modest departure for external R&D from perfect scaling frameworks. Download Paper


15019 
Yuichi Yamamoto 
"Stochastic Games with Hidden States, Second Version"  
This paper studies infinite-horizon stochastic games in which players observe payoffs and noisy public information about a hidden state each period. Public randomization is available. We find that, very generally, the feasible and individually rational payoff set is invariant to the initial prior about the state in the limit as the discount factor goes to one. We also provide a recursive characterization of the equilibrium payoff set and establish the folk theorem. Download Paper


15018 
Francis X. Diebold, Frank Schorfheide, Minchul Shin
"RealTime Forecast Evaluation of DSGE Models with Stochastic Volatility"  
Recent work has analyzed the forecasting performance of standard dynamic stochastic general equilibrium (DSGE) models, but little attention has been given to DSGE models that incorporate nonlinearities in exogenous driving processes. Against that background, we explore whether incorporating stochastic volatility improves DSGE forecasts (point, interval, and density). We examine real-time forecast accuracy for key macroeconomic variables including output growth, inflation, and the policy rate. We find that incorporating stochastic volatility in DSGE models of macroeconomic fundamentals markedly improves their density forecasts, just as incorporating stochastic volatility in models of financial asset returns improves their density forecasts. Download Paper
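For reference, a sketch of the standard stochastic volatility specification typically attached to DSGE shock processes (details may differ from the paper): an exogenous driving process $z_t = \rho z_{t-1} + \sigma_t \eta_t$ with $\eta_t \sim N(0,1)$, where $\log \sigma_t = (1 - \rho_\sigma) \log \bar{\sigma} + \rho_\sigma \log \sigma_{t-1} + \nu_t$ and $\nu_t \sim N(0, \omega^2)$, so the volatility of shocks itself evolves over time and the model's predictive densities widen and narrow accordingly.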


15017 
Xu Cheng, Zhipeng Liao, Ruoyao Shi
"Uniform Asymptotic Risk of Averaging GMM Estimator Robust to Misspecification" Second Version  
This paper studies the averaging generalized method of moments (GMM) estimator that combines a conservative GMM estimator based on valid moment conditions and an aggressive GMM estimator based on both valid and possibly misspecified moment conditions, where the weight is the sample analog of an infeasible optimal weight. It is an alternative to pretest estimators that switch between the conservative and aggressive estimators based on model specification tests. This averaging estimator is robust in the sense that it uniformly dominates the conservative estimator by reducing the risk under any degree of misspecification, whereas the pretest estimators reduce the risk in parts of the parameter space and increase it in other parts.
To establish uniform dominance of one estimator over another, we establish asymptotic theories on uniform approximations of the finite-sample risk differences between two estimators. These asymptotic results are developed along drifting sequences of data generating processes (DGPs) that model various degrees of local misspecification as well as global misspecification. Extending seminal results on the James-Stein estimator, the uniform dominance is established in non-Gaussian semiparametric nonlinear models. The proposed averaging estimator is applied to estimate the human capital production function in a life-cycle labor supply model. Download Paper
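A minimal scalar illustration of the averaging idea, in hypothetical notation rather than the paper's general GMM expressions: with a conservative estimator $\hat{\theta}_c$ (asymptotically unbiased, variance $V_c$) and an aggressive estimator $\hat{\theta}_a$ (asymptotic bias $b$, variance $V_a$, covariance $C$ with $\hat{\theta}_c$), the averaging estimator is $\hat{\theta}(\omega) = (1 - \omega) \hat{\theta}_c + \omega \hat{\theta}_a$, and the infeasible MSE-minimizing weight is $\omega^* = (V_c - C) / (b^2 + V_c + V_a - 2C)$; the feasible estimator replaces the unknowns in $\omega^*$ with sample analogs.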


15016 
Zenan Wu, Xi Weng
"Managerial Turnover and Entrenchment"  
We consider a two-period model in which the success of the firm depends on the effort of a first-period manager (the incumbent) and the ability of a second-period manager. At the end of the first period, the board receives a noisy signal of the incumbent manager's ability and decides whether to retain or replace the incumbent manager. We show that the information technology the board has to assess the incumbent manager's ability is an important determinant of the optimal contract and replacement policy. The contract must balance providing incentives for the incumbent manager to exert effort and ensuring that the second-period manager is of high ability. We show that severance pay in the contract serves as a costly commitment device to induce effort. Unlike existing models, we identify conditions on the information structure under which both entrenchment and anti-entrenchment emerge in the optimal contract. Download Paper


15015 
David Dillenberger, Uzi Segal
"Skewed Noise"  
We study the attitude of decision makers to skewed noise. For a binary lottery that yields the better outcome with probability $p$, we identify noise around $p$ with a compound lottery that induces a distribution over the exact value of the probability and has average value $p$. We propose and characterize a new notion of skewed distributions, and use a recursive non-expected utility model to provide conditions under which rejection of symmetric noise implies rejection of noise that is skewed to the left as well. We demonstrate that rejection of these types of noise does not preclude acceptance of some noise that is skewed to the right, in agreement with recent experimental evidence. We apply the model to study random allocation problems (one-sided matching) and show that it can predict systematic preference for one allocation mechanism over the other, even if the two agree on the overall probability distribution over assignments. The model can also be used to address the phenomenon of ambiguity seeking in the context of decision making under uncertainty. Download Paper
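An illustrative numerical example (ours, not taken from the paper): fix a binary lottery that pays the better outcome with probability $p = 0.5$. A symmetric noise around $p$ makes the probability $0.4$ or $0.6$ with equal chances; a left-skewed noise makes it $0.2$ with probability $1/4$ and $0.6$ with probability $3/4$. Both compound lotteries have average probability $0.5$, so a decision maker who reduces compound lotteries is indifferent among them, whereas the recursive model in the paper can rank them.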


15014 
Ufuk Akcigit, Salome Baslandze, Stefanie Stantcheva
"Taxation and the International Mobility of Inventors"  
This paper studies the effect of top tax rates on inventors' mobility since 1977. We put special emphasis on "superstar" inventors, those with the most and most valuable patents. We use panel data on inventors from the United States and European Patent Offices to track inventors' locations over time and combine them with international effective top tax rate data. We construct a detailed set of proxies for inventors' counterfactual incomes in each possible destination country, including, among others, measures of patent quality and technological fit with each potential destination. We find that superstar top 1% inventors are significantly affected by top tax rates when deciding where to locate. The elasticity of the number of domestic inventors to the net-of-tax rate is relatively small, between 0.04 and 0.06, while the elasticity of the number of foreign inventors is much larger, around 1.3. The elasticities to top net-of-tax rates decline as one moves down the quality distribution of inventors. Inventors who work in multinational companies are more likely to take advantage of tax differentials. On the other hand, if the company of an inventor has a higher share of its research activity in a given country, the inventor is less sensitive to the tax rate in that country. Download Paper


15013 
Itzhak Gilboa, Andrew Postlewaite, David Schmeidler
"Consumer Choice as Constrained Imitation"  
A literal interpretation of neoclassical consumer theory suggests that the consumer solves a very complex problem. In the presence of indivisible goods, the consumer problem is NP-hard, and it appears unlikely that it can be optimally solved by humans. An alternative approach is suggested, according to which the household chooses how to allocate its budget among product categories in a way that need not be compatible with utility maximization. Rather, the household has a set of constraints and, among the allocations satisfying them, chooses one in a case-based manner, influenced by the choices of other, similar households, or by its own past choices. We offer an axiomatization of this model. Download Paper
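A minimal formal sketch of why the problem is hard (our illustration of the NP-hardness claim): with indivisible goods $i = 1, \dots, n$, prices $p_i$, utilities $u_i$, and budget $B$, the consumer solves $\max_{x \in \{0,1\}^n} \sum_i u_i x_i$ subject to $\sum_i p_i x_i \le B$, which is a 0-1 knapsack problem, a canonical NP-hard optimization problem.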


15012 
George J. Mailath, Andrew Postlewaite, Larry Samuelson
"Buying Locally"  
“Buy local” arrangements encourage members of a community or group to patronize one another rather than the external economy. They range from formal mechanisms such as local currencies to informal “I’ll buy from you if you buy from me” arrangements, and are often championed on social or environmental grounds. We show that in a monopolistically competitive economy, buy local arrangements can have salutary effects even for selfish agents immune to social or environmental considerations. Buy local arrangements effectively allow firms to exploit the equilibrium price-cost gap to profitably expand their sales at the going price. Download Paper


15011 
Richard P. McLean, Andrew Postlewaite
"Informational size and twostage mechanisms"  
We showed in McLean and Postlewaite (2014) that when agents are informationally small, there exist small modifications to VCG mechanisms in interdependent value problems that restore incentive compatibility. This paper presents a two-stage mechanism that similarly restores incentive compatibility. The first stage essentially elicits that part of the agents’ private information that induces interdependence and reveals it to all agents, transforming the interdependent value problem into a private value problem. The second stage is a VCG mechanism for the now private value problem. Agents typically need to transmit substantially less information in the two-stage mechanism than would be necessary for a single-stage mechanism. Lastly, the first stage that elicits the part of the agents’ private information that induces interdependence can be used to transform certain other interdependent value problems into private value problems. Download Paper


15010 
Audrey Hu, Steven Matthews, Liang Zou
" English Auctions with Ensuing Risks and Heterogeneous Bidders"  
We establish conditions under which an English auction for an indivisible risky asset has an efficient ex post equilibrium when the bidders are heterogeneous in both their exposures to, and their attitudes toward, the ensuing risk the asset will generate for the winning bidder. Each bidder's privately known type is unidimensional, but may affect both his risk attitude and the expected value of the asset's return to the winner. An ex post equilibrium in which the winning bidder has the largest willingness to pay for the asset exists if two conditions hold: each bidder's marginal utility of income is log-supermodular, and the vector-valued function mapping the type vector into the bidders' expected values for the asset satisfies a weighted average crossing condition. However, this equilibrium need not be efficient. We show that it is efficient if each bidder's expected value for the asset is nonincreasing in the types of the other bidders, or if the bidders exhibit nonincreasing absolute risk aversion, or if the asset is riskless. Download Paper
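For reference, a sketch of the first condition in hypothetical notation: a bidder's marginal utility of income $v(w, t)$ (income $w$, type $t$) is log-supermodular if $v(w', t')\, v(w, t) \ge v(w', t)\, v(w, t')$ whenever $w' \ge w$ and $t' \ge t$; equivalently, for $w' \ge w$ the ratio $v(w', t) / v(w, t)$ is nondecreasing in the type $t$, so higher types place relatively greater marginal value on income at higher income levels.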


15009 
David Dillenberger, Andrew Postlewaite, Kareen Rozen
"Optimism and Pessimism with Expected Utility, Fifth Version"  
Maximizing subjective expected utility is the classic model of decision making under uncertainty. Savage (1954) provides axioms on preferences over acts that are equivalent to the existence of a subjective expected utility representation, and further establishes that such a representation is essentially unique. We show that there is a continuum of other "expected utility" representations in which the probability distributions over states used to evaluate acts depend on the set of possible outcomes of the act, and suggest that these alternate representations can capture pessimism or optimism. We then extend the decision maker's preferences to be defined over both subjective acts and objective lotteries, allowing for source-dependent preferences. Our result permits modeling ambiguity aversion in Ellsberg's two-urn experiment using a single utility function and pessimistic probability assessments over prizes for lotteries and acts, while maintaining the axioms of Savage and von Neumann-Morgenstern on the appropriate domains. Download Paper


15008 
George J. Mailath, Volker Nocke, Lucy White
"When and How the Punishment Must Fit the Crime"  
In repeated normal-form (simultaneous-move) games, simple penal codes (Abreu, 1986, 1988) permit an elegant characterization of the set of subgame-perfect outcomes. We show that the logic of simple penal codes fails in repeated extensive-form games. By means of examples, we identify two types of settings in which a subgame-perfect outcome may be supported only by a profile with the property that the continuation play after a deviation is tailored not only to the identity of the deviator, but also to the nature of the deviation. Download Paper


15007 
Yuichi Yamamoto 
"Stochastic Games with Hidden States"  
This paper studies infinite-horizon stochastic games in which players observe noisy public information about a hidden state each period. We find that if the game is connected, the limit feasible payoff set exists and is invariant to the initial prior about the state. Building on this invariance result, we provide a recursive characterization of the equilibrium payoff set and establish the folk theorem. We also show that connectedness can be replaced with an even weaker condition, called asymptotic connectedness. Asymptotic connectedness is satisfied for generic signal distributions, if the state evolution is irreducible. Download Paper


15006 
Olivier Compte, Andrew Postlewaite
"Plausible Cooperation," Fourth Version  
There is a large repeated games literature illustrating how future interactions provide incentives for cooperation. Much of the earlier literature assumes public monitoring. Departures from public monitoring to private monitoring that incorporate differences in players’ observations may dramatically complicate coordination and the provision of incentives, with the consequence that equilibria with private monitoring often seem unrealistically complex or fragile. We set out a model in which players accomplish cooperation in an intuitively plausible fashion. Players process information via a mental system — a set of psychological states and a transition function between states depending on observations. Players restrict attention to a relatively small set of simple strategies, and consequently, might learn which perform well. Download Paper
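A minimal sketch of what a "mental system" could look like: a small set of psychological states plus a transition rule mapping the current state and the period's observation to a new state. The states, signals, and rule below are hypothetical illustrations, not the authors' specification.

```python
# Two psychological states and a simple transition rule: become upset after
# observing a defection, calm down after observing cooperation. The player's
# strategy depends only on the current mental state. Illustrative only.
COOPERATE, DEFECT = "C", "D"
CALM, UPSET = "calm", "upset"

def action(state: str) -> str:
    """Play cooperatively when calm, punish when upset."""
    return COOPERATE if state == CALM else DEFECT

def transition(state: str, observed_partner_action: str) -> str:
    """Update the mental state based on this period's observation."""
    return UPSET if observed_partner_action == DEFECT else CALM

# example: a calm player who observes a defection punishes next period
state = transition(CALM, DEFECT)
print(action(state))  # -> "D"
```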


15005 
Itzhak Gilboa, Andrew Postlewaite, Larry Samuelson
"Memory Utility"  
People often consume nondurable goods in a way that seems inconsistent with preferences for smoothing consumption over time. We suggest that such patterns of consumption can be better explained if one takes into account the memories that consumption generates. A memorable good, such as a honeymoon or a vacation, is a good whose mental consumption outlives its physical consumption. We consider a model in which a consumer enjoys physical consumption as well as memories. Memories are generated only by some goods, and only when their consumption exceeds customary levels by a sufficient margin. We offer axiomatic foundations for the structure of the utility function and study optimal consumption in a dynamic model. The model shows how rational consumers, taking into account their future memories, would make optimal choices that rationalize lumpy patterns of consumption. Download Paper


15004 
Rong Hai, Dirk Krueger, Andrew Postlewaite
"On the Welfare Cost of Consumption Fluctuations in the Presence of Memorable Goods," Second Version  
We propose a new category of consumption goods, memorable goods, that generate a flow of utility after consumption. We analyze an otherwise standard consumption model that distinguishes memorable goods from other nondurable goods. Consumers optimally choose lumpy consumption of memorable goods. We empirically document differences between the levels and volatilities of memorable and other goods expenditures. Memorable goods expenditures are about twice as large as durable goods expenditures and about half as volatile. The welfare cost of consumption fluctuations driven by income shocks is overstated if memorable goods are not accounted for, and estimates of excess sensitivity of consumption might be due to memorable goods. Download Paper


15003 
Guido Menzio, Nicholas Trachter
"Equilibrium Price Dispersion Across and Within Stores"  
We develop a search-theoretic model of the product market that generates price dispersion across and within stores. Buyers differ with respect to their ability to shop around, both at different stores and at different times. The fact that some buyers can shop from only one seller while others can shop from multiple sellers causes price dispersion across stores. The fact that the buyers who can shop from multiple sellers are more likely to be able to shop at inconvenient times causes price dispersion within stores. Specifically, it causes sellers to post different prices for the same good at different times in order to discriminate between different types of buyers. Download Paper


15002 
Zehao Hu 
"Financing Innovation with Unobserved Progress"  
This paper studies the problem of incentivizing an agent in an innovation project when the progress of innovation is known only to the agent. I assume the success of innovation requires an intermediate breakthrough and a final breakthrough, with only the latter being observed by the principal. Two properties of optimal contracts are identified. First, conditional on the total level of financing, optimal contracts induce efficient actions from the agent. Second, the reward for success to the agent is in general non-monotone in success time, and later success may be rewarded more. The latter property is consistent with the use of time-vested equity as part of compensation schemes for entrepreneurs. I then extend the model by introducing randomly arriving buyers and apply it to study the financing of start-up firms with opportunities to be acquired. I show that the potential acquisition increases the cost of providing incentives. Since an agent with a low level of progress is “bailed out” when an offer is made to acquire firms with both high and low levels of progress, the agent has more incentive to shirk. In response, the principal reduces the likelihood that the firm with a high level of progress is sold. Moreover, the total financing provided by the principal is lower than in the environment without buyers. Download Paper


15001 
Selman Erol, Rakesh Vohra
"Network Formation and Systemic Risk, Second Version"  
This paper introduces a model of endogenous network formation and systemic risk. In it, strategic agents form networks that efficiently trade off the possibility of systemic risk against the benefits of trade. First, efficiency is a consequence of the high risk of contagion, which forces agents to endogenize their externalities. Second, fundamentally ‘safer’ economies generate much higher interconnectedness, which in turn leads to higher systemic risk. Third, the structure of the network formed depends crucially on whether the shocks to the system are believed to be correlated or independent of each other. This underlines the importance of specifying the shock structure before investigating a given network, as a particular network and shock structure could be incompatible. Download Paper


14045 
Francis J. DiTraglia 
"Using Invalid Instruments on Purpose: Focused Moment Selection and Averaging for GMM", Second Version  
In finite samples, the use of a slightly endogenous but highly relevant instrument can reduce mean-squared error (MSE). Building on this observation, I propose a moment selection criterion for GMM in which moment conditions are chosen based on the MSE of their associated estimators rather than their validity: the focused moment selection criterion (FMSC). I then show how the framework used to derive the FMSC can address the problem of inference post-moment selection. Treating post-selection estimators as a special case of moment averaging, in which estimators based on different moment sets are given data-dependent weights, I propose a simulation-based procedure to construct valid confidence intervals for a variety of formal and informal moment-selection and averaging procedures. Both the FMSC and confidence interval procedure perform well in simulations. I conclude with an empirical example examining the effect of instrument selection on the estimated relationship between malaria transmission and income. Download Paper


14044 
Daron Acemoglu, Ufuk Akcigit, Douglas Hanley, William Kerr
"Transition to Clean Technology"  
We develop a microeconomic model of endogenous growth where clean and dirty technologies compete in production and innovation, in the sense that research can be directed to either clean or dirty technologies. If dirty technologies are more advanced to start with, the potential transition to clean technology can be difficult both because clean research must climb several rungs to catch up with dirty technology and because this gap discourages research effort directed towards clean technologies. Carbon taxes and research subsidies may nonetheless encourage production and innovation in clean technologies, though the transition will typically be slow. We characterize certain general properties of the transition path from dirty to clean technology. We then estimate the model using a combination of regression analysis on the relationship between R&D and patents, and simulated method of moments using microdata on employment, production, R&D, firm growth, entry, and exit from the US energy sector. The model’s quantitative implications match quite well a range of moments not targeted in the estimation. We then characterize the optimal policy path implied by the model and our estimates. Optimal policy makes heavy use of research subsidies as well as carbon taxes. We use the model to evaluate the welfare consequences of a range of alternative policies. Download Paper
