Working Papers
By Year:
Paper #  Author  Title  

15019 
Yuichi Yamamoto 
"Stochastic Games with Hidden States, Second Version"  
This paper studies infinite-horizon stochastic games in which players observe payoffs and noisy public information about a hidden state each period. Public randomization is available. We find that, very generally, the feasible and individually rational payoff set is invariant to the initial prior about the state in the limit as the discount factor goes to one. We also provide a recursive characterization of the equilibrium payoff set and establish the folk theorem. Download Paper


15018 
Francis X. Diebold Frank Schorfheide Minchul Shin 
"Real-Time Forecast Evaluation of DSGE Models with Stochastic Volatility"  
Recent work has analyzed the forecasting performance of standard dynamic stochastic general equilibrium (DSGE) models, but little attention has been given to DSGE models that incorporate nonlinearities in exogenous driving processes. Against that background, we explore whether incorporating stochastic volatility improves DSGE forecasts (point, interval, and density). We examine real-time forecast accuracy for key macroeconomic variables including output growth, inflation, and the policy rate. We find that incorporating stochastic volatility in DSGE models of macroeconomic fundamentals markedly improves their density forecasts, just as incorporating stochastic volatility in models of financial asset returns improves their density forecasts. Download Paper


15017 
Xu Cheng Zhipeng Liao Ruoyao Shi 
"Uniform Asymptotic Risk of Averaging GMM Estimator Robust to Misspecification" Second Version  
This paper studies the averaging generalized method of moments (GMM) estimator that combines a conservative GMM estimator based on valid moment conditions and an aggressive GMM estimator based on both valid and possibly misspecified moment conditions, where the weight is the sample analog of an infeasible optimal weight. It is an alternative to pretest estimators that switch between the conservative and aggressive estimators based on model specification tests. This averaging estimator is robust in the sense that it uniformly dominates the conservative estimator by reducing the risk under any degree of misspecification, whereas the pretest estimators reduce the risk in parts of the parameter space and increase it in other parts.
To establish uniform dominance of one estimator over another, we establish asymptotic theories on uniform approximations of the finite-sample risk differences between two estimators. These asymptotic results are developed along drifting sequences of data generating processes (DGPs) that model various degrees of local misspecification as well as global misspecification. Extending seminal results on the James-Stein estimator, the uniform dominance is established in non-Gaussian semiparametric nonlinear models. The proposed averaging estimator is applied to estimate the human capital production function in a life-cycle labor supply model. Download Paper


15016 
Zenan Wu Xi Weng 
"Managerial Turnover and Entrenchment"  
We consider a two-period model in which the success of the firm depends on the effort of a first-period manager (the incumbent) and the ability of a second-period manager. At the end of the first period, the board receives a noisy signal of the incumbent manager's ability and decides whether to retain or replace the incumbent manager. We show that the information technology the board has to assess the incumbent manager's ability is an important determinant of the optimal contract and replacement policy. The contract must balance providing incentives for the incumbent manager to exert effort and ensuring that the second-period manager is of high ability. We show that severance pay in the contract serves as a costly commitment device to induce effort. Unlike existing models, we identify conditions on the information structure under which both entrenchment and anti-entrenchment emerge in the optimal contract. Download Paper


15015 
David Dillenberger Uzi Segal 
"Skewed Noise"  
We study the attitude of decision makers to skewed noise. For a binary lottery that yields the better outcome with probability $p$, we identify noise around $p$ with a compound lottery that induces a distribution over the exact value of the probability and has an average value $p$. We propose and characterize a new notion of skewed distributions, and use a recursive non-expected utility model to provide conditions under which rejection of symmetric noise implies rejection of skewed-to-the-left noise as well. We demonstrate that rejection of these types of noise does not preclude acceptance of some skewed-to-the-right noise, in agreement with recent experimental evidence. We apply the model to study random allocation problems (one-sided matching) and show that it can predict systematic preference for one allocation mechanism over the other, even if the two agree on the overall probability distribution over assignments. The model can also be used to address the phenomenon of ambiguity seeking in the context of decision making under uncertainty. Download Paper


15014 
Ufuk Akcigit Salome Baslandze Stefanie Stantcheva 
"Taxation and the International Mobility of Inventors"  
This paper studies the effect of top tax rates on inventors' mobility since 1977. We put special emphasis on "superstar" inventors, those with the most and most valuable patents. We use panel data on inventors from the United States and European Patent Offices to track inventors' locations over time and combine it with international effective top tax rate data. We construct a detailed set of proxies for inventors' counterfactual incomes in each possible destination country including, among others, measures of patent quality and technological fit with each potential destination. We find that superstar top 1% inventors are significantly affected by top tax rates when deciding where to locate. The elasticity of the number of domestic inventors to the net-of-tax rate is relatively small, between 0.04 and 0.06, while the elasticity of the number of foreign inventors is much larger, around 1.3. The elasticities to top net-of-tax rates decline as one moves down the quality distribution of inventors. Inventors who work in multinational companies are more likely to take advantage of tax differentials. On the other hand, if the company of an inventor has a higher share of its research activity in a given country, the inventor is less sensitive to the tax rate in that country. Download Paper


15013 
Itzhak Gilboa Andrew Postlewaite David Schmeidler 
"Consumer Choice as Constrained Imitation"  
A literal interpretation of neoclassical consumer theory suggests that the consumer solves a very complex problem. In the presence of indivisible goods, the consumer problem is NP-hard, and it appears unlikely that it can be optimally solved by humans. An alternative approach is suggested, according to which the household chooses how to allocate its budget among product categories without necessarily being compatible with utility maximization. Rather, the household has a set of constraints, and among these it chooses an allocation in a case-based manner, influenced by the choices of other, similar households, or of itself in the past. We offer an axiomatization of this model. Download Paper


15012 
George J. Mailath Andrew Postlewaite Larry Samuelson 
"Buying Locally"  
“Buy local” arrangements encourage members of a community or group to patronize one another rather than the external economy. They range from formal mechanisms such as local currencies to informal “I’ll buy from you if you buy from me” arrangements, and are often championed on social or environmental grounds. We show that in a monopolistically competitive economy, buy local arrangements can have salutary effects even for selfish agents immune to social or environmental considerations. Buy local arrangements effectively allow firms to exploit the equilibrium pricecost gap to profitably expand their sales at the going price. Download Paper


15011 
Richard P. McLean Andrew Postlewaite 
"Informational Size and Two-Stage Mechanisms"  
We showed in McLean and Postlewaite (2014) that when agents are informationally small, there exist small modifications to VCG mechanisms in interdependent value problems that restore incentive compatibility. This paper presents a two-stage mechanism that similarly restores incentive compatibility. The first stage essentially elicits that part of the agents’ private information that induces interdependence and reveals it to all agents, transforming the interdependent value problem into a private value problem. The second stage is a VCG mechanism for the now private value problem. Agents typically need to transmit substantially less information in the two-stage mechanism than would be necessary for a single-stage mechanism. Lastly, the first stage that elicits the part of the agents’ private information that induces interdependence can be used to transform certain other interdependent value problems into private value problems. Download Paper


15010 
Audrey Hu Steven Matthews Liang Zou 
"English Auctions with Ensuing Risks and Heterogeneous Bidders"  
We establish conditions under which an English auction for an indivisible risky asset has an efficient ex post equilibrium when the bidders are heterogeneous in both their exposures to, and their attitudes toward, the ensuing risk the asset will generate for the winning bidder. Each bidder's privately known type is unidimensional, but may affect both his risk attitude and the expected value of the asset's return to the winner. An ex post equilibrium in which the winning bidder has the largest willingness to pay for the asset exists if two conditions hold: each bidder's marginal utility of income is log-supermodular, and the vector-valued function mapping the type vector into the bidders' expected values for the asset satisfies a weighted average crossing condition. However, this equilibrium need not be efficient. We show that it is efficient if each bidder's expected value for the asset is nonincreasing in the types of the other bidders, or if the bidders exhibit nonincreasing absolute risk aversion, or if the asset is riskless. Download Paper


15009 
David Dillenberger Andrew Postlewaite Kareen Rozen 
"Optimism and Pessimism with Expected Utility, Fifth Version"  
Maximizing subjective expected utility is the classic model of decision making under uncertainty. Savage (1954) provides axioms on preference over acts that are equivalent to the existence of a subjective expected utility representation, and further establishes that such a representation is essentially unique. We show that there is a continuum of other "expected utility" representations in which the probability distributions over states used to evaluate acts depend on the set of possible outcomes of the act, and suggest that these alternate representations can capture pessimism or optimism. We then extend the decision maker's preferences to be defined over both subjective acts and objective lotteries, allowing for source-dependent preferences. Our result permits modeling ambiguity aversion in Ellsberg's two-urn experiment using a single utility function and pessimistic probability assessments over prizes for lotteries and acts, while maintaining the axioms of Savage and von Neumann-Morgenstern on the appropriate domains. Download Paper


15008 
George J. Mailath Volker Nocke Lucy White 
"When and How the Punishment Must Fit the Crime"  
In repeated normal-form (simultaneous-move) games, simple penal codes (Abreu, 1986, 1988) permit an elegant characterization of the set of subgame-perfect outcomes. We show that the logic of simple penal codes fails in repeated extensive-form games. By means of examples, we identify two types of settings in which a subgame-perfect outcome may be supported only by a profile with the property that the continuation play after a deviation is tailored not only to the identity of the deviator, but also to the nature of the deviation. Download Paper


15007 
Yuichi Yamamoto 
"Stochastic Games with Hidden States"  
This paper studies infinite-horizon stochastic games in which players observe noisy public information about a hidden state each period. We find that if the game is connected, the limit feasible payoff set exists and is invariant to the initial prior about the state. Building on this invariance result, we provide a recursive characterization of the equilibrium payoff set and establish the folk theorem. We also show that connectedness can be replaced with an even weaker condition, called asymptotic connectedness. Asymptotic connectedness is satisfied for generic signal distributions if the state evolution is irreducible. Download Paper


15006 
Olivier Compte Andrew Postlewaite 
"Plausible Cooperation," Fourth Version  
There is a large repeated games literature illustrating how future interactions provide incentives for cooperation. Much of the earlier literature assumes public monitoring. Departures from public monitoring to private monitoring that incorporate differences in players’ observations may dramatically complicate coordination and the provision of incentives, with the consequence that equilibria with private monitoring often seem unrealistically complex or fragile. We set out a model in which players accomplish cooperation in an intuitively plausible fashion. Players process information via a mental system — a set of psychological states and a transition function between states depending on observations. Players restrict attention to a relatively small set of simple strategies, and consequently may learn which ones perform well. Download Paper


15005 
Itzhak Gilboa Andrew Postlewaite Larry Samuelson 
"Memory Utility"  
People often consume nondurable goods in a way that seems inconsistent with preferences for smoothing consumption over time. We suggest that such patterns of consumption can be better explained if one takes into account the memories that consumption generates. A memorable good, such as a honeymoon or a vacation, is a good whose mental consumption outlives its physical consumption. We consider a model in which a consumer enjoys physical consumption as well as memories. Memories are generated only by some goods, and only when their consumption exceeds customary levels by a sufficient margin. We offer axiomatic foundations for the structure of the utility function and study optimal consumption in a dynamic model. The model shows how rational consumers, taking into account their future memories, would make optimal choices that rationalize lumpy patterns of consumption. Download Paper


15004 
Rong Hai Dirk Krueger Andrew Postlewaite 
"On the Welfare Cost of Consumption Fluctuations in the Presence of Memorable Goods," Second Version  
We propose a new category of consumption goods, memorable goods, that generate a flow of utility after consumption. We analyze an otherwise standard consumption model that distinguishes memorable goods from other nondurable goods. Consumers optimally choose lumpy consumption of memorable goods. We empirically document differences between the levels and volatilities of memorable and other goods expenditures. Memorable goods expenditures are about twice as large as durable goods expenditures, with about half the volatility. The welfare cost of consumption fluctuations driven by income shocks is overstated if memorable goods are not accounted for, and estimates of excess sensitivity of consumption might be due to memorable goods. Download Paper


15003 
Guido Menzio Nicholas Trachter 
"Equilibrium Price Dispersion Across and Within Stores"  
We develop a search-theoretic model of the product market that generates price dispersion across and within stores. Buyers differ with respect to their ability to shop around, both at different stores and at different times. The fact that some buyers can shop from only one seller while others can shop from multiple sellers causes price dispersion across stores. The fact that the buyers who can shop from multiple sellers are more likely to be able to shop at inconvenient times causes price dispersion within stores. Specifically, it causes sellers to post different prices for the same good at different times in order to discriminate between different types of buyers. Download Paper


15002 
Zehao Hu 
"Financing Innovation with Unobserved Progress"  
This paper studies the problem of incentivizing an agent in an innovation project when the progress of innovation is known only to the agent. I assume the success of innovation requires an intermediate breakthrough and a final breakthrough, with only the latter being observed by the principal. Two properties of optimal contracts are identified. First, conditional on the total level of financing, optimal contracts induce efficient actions from the agent. Second, the reward for success to the agent is in general nonmonotone in success time, and later success may be rewarded more. The latter property is consistent with the use of time-vested equity as part of compensation schemes for entrepreneurs. I then extend the model by introducing randomly arriving buyers and apply it to study the financing of startup firms with opportunities to be acquired. I show that the potential acquisition increases the cost of providing incentives. Since an agent with a low level of progress is “bailed out” when an offer is made to acquire firms with both high and low levels of progress, the agent has more incentive to shirk. In response, the principal reduces the likelihood that the firm with a high level of progress is sold. Moreover, the total financing provided by the principal is lower than in the environment without buyers. Download Paper


15001 
Selman Erol Rakesh Vohra 
"Network Formation and Systemic Risk, Second Version"  
This paper introduces a model of endogenous network formation and systemic risk. In it, strategic agents form networks that efficiently trade off the possibility of systemic risk against the benefits of trade. First, efficiency is a consequence of the high risk of contagion, which forces agents to internalize their externalities. Second, fundamentally ‘safer’ economies generate much higher interconnectedness, which in turn leads to higher systemic risk. Third, the structure of the network formed depends crucially on whether the shocks to the system are believed to be correlated or independent of each other. This underlines the importance of specifying the shock structure before investigating a given network, as a particular network and shock structure could be incompatible. Download Paper


14045 
Francis J. DiTraglia 
"Using Invalid Instruments on Purpose: Focused Moment Selection and Averaging for GMM", Second Version  
In finite samples, the use of a slightly endogenous but highly relevant instrument can reduce mean-squared error (MSE). Building on this observation, I propose a moment selection criterion for GMM in which moment conditions are chosen based on the MSE of their associated estimators rather than their validity: the focused moment selection criterion (FMSC). I then show how the framework used to derive the FMSC can address the problem of inference post-moment selection. Treating post-selection estimators as a special case of moment averaging, in which estimators based on different moment sets are given data-dependent weights, I propose a simulation-based procedure to construct valid confidence intervals for a variety of formal and informal moment-selection and averaging procedures. Both the FMSC and confidence interval procedure perform well in simulations. I conclude with an empirical example examining the effect of instrument selection on the estimated relationship between malaria transmission and income. Download Paper


14044 
Daron Acemoglu Ufuk Akcigit Douglas Hanley William Kerr 
"Transition to Clean Technology"  
We develop a microeconomic model of endogenous growth where clean and dirty technologies compete in production and innovation, in the sense that research can be directed to either clean or dirty technologies. If dirty technologies are more advanced to start with, the potential transition to clean technology can be difficult both because clean research must climb several rungs to catch up with dirty technology and because this gap discourages research effort directed towards clean technologies. Carbon taxes and research subsidies may nonetheless encourage production and innovation in clean technologies, though the transition will typically be slow. We characterize certain general properties of the transition path from dirty to clean technology. We then estimate the model using a combination of regression analysis on the relationship between R&D and patents, and simulated method of moments using microdata on employment, production, R&D, firm growth, entry and exit from the US energy sector. The model’s quantitative implications match a range of moments not targeted in the estimation quite well. We then characterize the optimal policy path implied by the model and our estimates. Optimal policy makes heavy use of research subsidies as well as carbon taxes. We use the model to evaluate the welfare consequences of a range of alternative policies. Download Paper


14043 
Sina T. Ates Felipe E. Saffie 
"Fewer but Better: Sudden Stops, Firm Entry, and Financial Selection"  
We combine the real business cycle small open economy framework with the endogenous growth literature to study the productivity cost of a sudden stop. In this economy, productivity growth is determined by successful implementation of business ideas, yet the quality of ideas is heterogeneous and good ideas are scarce. A representative financial intermediary screens and selects the most promising ideas, which gives rise to a tradeoff between mass (quantity) and composition (quality) in the entrant cohort. Chilean plant-level data from the sudden stop triggered by the Russian sovereign default in 1998 confirms the main mechanism of the model, as firms born during the credit shortage are fewer, but better. A calibrated version of the economy shows the importance of accounting for heterogeneity and selection, as otherwise the permanent loss of output generated by the forgone entrants doubles, which increases the welfare cost by 30%. Download Paper


14042 
Joseph E. Harrington, Jr. Yanhao Wei 
"What Can the Duration of Discovered Cartels Tell Us About the Duration of Cartels?"  
There are many data sets based on the population of discovered cartels, and it is from these data that average cartel duration and the annual probability of cartel death are estimated. It is recognized, however, that these estimates could be biased because the population of discovered cartels may not be a representative sample of the population of cartels. This paper constructs a simple birth-death-discovery process to theoretically investigate what we can learn about cartels from data on discovered cartels. Download Paper


14041 
Yanhao Wei 
"Network Effects of Air Travel Demand" Second Version  
As demand increases, airline carriers often increase flight frequencies to meet the larger flow of passengers in their networks, which reduces passengers' schedule delays and attracts more demand. Focusing on these "network effects," this paper develops and estimates a structural model of the U.S. airline industry. Compared with previous studies, the model implies higher cost estimates, which seem more consistent with the unprofitability of the industry; below-marginal-cost pricing becomes possible and appears on many routes. I also study airline mergers and find that the network effects can be the main factor underlying their profitability. Download Paper


14040 
SeulKi Shin 
"Preferences vs. Opportunities: Racial/Ethnic Intermarriage in the United States"  
This paper develops and implements a new approach for separately identifying preference and opportunity parameters of a two-sided search and matching model in the absence of data on choice sets. This approach exploits information on the dynamics of matches: how long it takes for singles to form matches, what types of matches they form, and how long the matches last. Willingness to accept a certain type of partner can be revealed through the dissolution of matches. Given recovered acceptance rules, the rates at which singles meet different types are inferred from the observed transitions from singlehood to matches. Imposing equilibrium conditions links acceptance rules and arrival rates to underlying preference and opportunity parameters. Using the Panel Study of Income Dynamics, I apply this method to examine the marriage patterns of non-Hispanic whites, non-Hispanic blacks and Hispanics in the United States. Results indicate that the observed infrequency of intermarriage is primarily attributable to a low incidence of interracial/interethnic meetings rather than same-race/ethnicity preferences. Simulations based on the estimated model show the effects of demographic changes on marital patterns. Download Paper


14039 
Dirk Krueger Hans A. Holter Serhiy Stepanchuk 
"How Does Tax Progressivity and Household Heterogeneity Affect Laffer Curves?"  
How much additional tax revenue can the government generate by increasing labor income taxes? In this paper we provide a quantitative answer to this question, and study the importance of the progressivity of the tax schedule for the ability of the government to generate tax revenues. We develop a rich overlapping generations model featuring an explicit family structure, extensive and intensive margins of labor supply, endogenous accumulation of labor market experience, as well as standard intertemporal consumption-savings choices in the presence of uninsurable idiosyncratic labor productivity risk. We calibrate the model to US macro, micro and tax data and characterize the labor income tax Laffer curve under the current progressivity of the labor income tax code as well as when varying progressivity. We find that more progressive labor income taxes significantly reduce tax revenues. For the US, converting to a flat tax code raises the peak of the Laffer curve by 6%, whereas converting to a tax system with progressivity similar to Denmark's would lower the peak by 7%. We also show that, relative to a representative agent economy, tax revenues are less sensitive to the progressivity of the tax code in our economy. This finding is due to the fact that the labor supply of two-earner households is less elastic (along the intensive margin) and the endogenous accumulation of labor market experience makes the labor supply of females less elastic (along the extensive margin) to changes in tax progressivity. Download Paper


14038 
Francis X. Diebold Minchul Shin 
"Assessing Point Forecast Accuracy by Stochastic Error Distance"  
We propose point forecast accuracy measures based directly on the distance of the forecast-error c.d.f. from the unit step function at 0 ("stochastic error distance," or SED). We provide a precise characterization of the relationship between SED and standard predictive loss functions, showing that all such loss functions can be written as weighted SEDs. The leading case is absolute-error loss, in which the SED weights are unity, establishing its primacy. Among other things, this suggests shifting attention away from conditional-mean forecasts and toward conditional-median forecasts. Download Paper
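The SED idea in this abstract can be illustrated numerically. The sketch below (our own illustration under stated assumptions, not code from the paper) computes the distance between an empirical forecast-error c.d.f. and the unit step at zero; with unit weights this distance coincides with the mean absolute forecast error, the absolute-error case highlighted above.

```python
import numpy as np

def sed(errors, grid_points=200001):
    """Unit-weight stochastic error distance: integral of the gap between
    the empirical forecast-error c.d.f. and the unit step function at 0."""
    errors = np.sort(np.asarray(errors, dtype=float))
    lo, hi = errors[0] - 1.0, errors[-1] + 1.0
    grid = np.linspace(lo, hi, grid_points)
    # empirical c.d.f. of the forecast errors, evaluated on the grid
    F = np.searchsorted(errors, grid, side="right") / errors.size
    step = (grid >= 0.0).astype(float)  # unit step function at 0
    gap = np.abs(F - step)
    # left-Riemann approximation of the integral of |F - step|
    return float(np.sum(gap[:-1] * np.diff(grid)))

rng = np.random.default_rng(0)
e = rng.normal(loc=0.2, size=5000)  # simulated forecast errors
# with unit weights, SED matches the mean absolute error numerically
print(abs(sed(e) - np.mean(np.abs(e))) < 1e-3)
```

The equivalence follows because the gap equals F(e) for e < 0 and 1 - F(e) for e >= 0, and those two integrals sum to E|e|.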


14037 
Francis J. DiTraglia 
"Using Invalid Instruments on Purpose: Focused Moment Selection and Averaging for GMM"  
In finite samples, the use of a slightly endogenous but highly relevant instrument can reduce mean-squared error (MSE). Building on this observation, I propose a moment selection criterion for GMM in which moment conditions are chosen based on the MSE of their associated estimators rather than their validity: the focused moment selection criterion (FMSC). I then show how the framework used to derive the FMSC can address the problem of inference post-moment selection. Treating post-selection estimators as a special case of moment averaging, in which estimators based on different moment sets are given data-dependent weights, I propose a simulation-based procedure to construct valid confidence intervals for a variety of formal and informal moment-selection procedures. Both the FMSC and confidence interval procedure perform well in simulations. I conclude with an empirical example examining the effect of instrument selection on the estimated relationship between malaria transmission and per-capita income. Download Paper


14036 
Fabian Kindermann Dirk Krueger 
"High Marginal Tax Rates on the Top 1%? Lessons from a Life Cycle Model with Idiosyncratic Income Risk"  
In this paper we argue that very high marginal labor income tax rates are an effective tool for social insurance even when households have preferences with high labor supply elasticity, make dynamic savings decisions, and policies have general equilibrium effects. To make this point we construct a large-scale overlapping generations model with uninsurable labor productivity risk, show that it has a wealth distribution that matches the data well, and then use it to characterize fiscal policies that achieve a desired degree of redistribution in society. We find that marginal tax rates on the top 1% of the earnings distribution of close to 90% are optimal. We document that this result is robust to plausible variation in the labor supply elasticity and holds regardless of whether social welfare is measured at the steady state only or includes transitional generations. Download Paper


14035 
S. Boragan Aruoba Pablo Cuba Borda Frank Schorfheide 
"Macroeconomic Dynamics Near the ZLB: A Tale of Two Countries"  
We propose and solve a small-scale New Keynesian model with Markov sunspot shocks that move the economy between a targeted-inflation regime and a deflation regime, and fit it to data from the U.S. and Japan. For the U.S. we find that adverse demand shocks moved the economy to the zero lower bound (ZLB) in 2009 and that expansive monetary policy has kept it there subsequently. In contrast, Japan switched to the deflation regime in 1999 and has remained there since, except for a short period. The two scenarios have drastically different implications for macroeconomic policies. Fiscal multipliers are about 20% smaller in the deflationary regime, despite the economy remaining at the ZLB. While a commitment by the central bank to keep rates near the ZLB doubles the fiscal multipliers in the targeted-inflation regime (U.S.), it has no effect in the deflation regime (Japan). Download Paper


14034 
Marco Del Negro Raiden Hasegawa Frank Schorfheide 
"Dynamic Prediction Pools: An Investigation of Financial Frictions and Forecasting Performance"  
We provide a novel methodology for estimating time-varying weights in linear prediction pools, which we call Dynamic Pools, and use it to investigate the relative forecasting performance of DSGE models with and without financial frictions for output growth and inflation from 1992 to 2011. We find strong evidence of time variation in the pool's weights, reflecting the fact that the DSGE model with financial frictions produces superior forecasts in periods of financial distress but does not perform as well in tranquil periods. The dynamic pool's weights react in a timely fashion to changes in the environment, leading to real-time forecast improvements relative to other methods of density forecast combination, such as Bayesian Model Averaging, optimal (static) pools, and equal weights. We show how a policymaker dealing with model uncertainty could have used a dynamic pool to perform a counterfactual exercise (responding to the gap in labor market conditions) in the immediate aftermath of the Lehman crisis. Download Paper


14033 
Aislinn Bohren 
"Stochastic Games in Continuous Time: Persistent Actions in LongRun Relationships", Second Version  
This paper studies a class of continuous-time stochastic games in which the actions of a long-run player have a persistent effect on payoffs. For example, the quality of a firm's product depends on past as well as current effort, or the level of a policy instrument depends on a government's past choices. The long-run player faces a population of small players, and its actions are imperfectly observed. I establish the existence of Markov equilibria, characterize the Perfect Public Equilibrium (PPE) payoff set as the convex hull of the Markov equilibrium payoff set, and identify conditions for the uniqueness of a Markov equilibrium in the class of all PPE. The existence proof is constructive: it characterizes the explicit form of Markov equilibrium payoffs and actions, for any discount rate. Action persistence creates a crucial new channel to generate intertemporal incentives in a setting where traditional channels fail, and can provide permanent nontrivial incentives in many settings. These results offer a novel framework for thinking about the reputational dynamics of firms, governments, and other long-run agents. Download Paper


14032 
Sarah Baird Aislinn Bohren Craig McIntosh Berk Ozler 
"Designing Experiments to Measure Spillover Effects"  
This paper formalizes the design of experiments intended specifically to study spillover effects. By first randomizing the intensity of treatment within clusters and then randomly assigning individual treatment conditional on this cluster-level intensity, a novel set of treatment effects can be identified. We develop a formal framework for consistent estimation of these effects, and provide explicit expressions for power calculations. We show that the power to detect average treatment effects declines precisely with the quantity that identifies the novel treatment effects. A demonstration of the technique is provided using a cash transfer program in Malawi. Download Paper


14031 
Rebecca M. Stein Gloria Allione 
"Mass attrition: An analysis of drop out from a Principles of Microeconomics MOOC"  
Though Massive Open Online Courses differ greatly from one another in structure, content, or audience, they are all characterized by low completion rates. In this paper, we use the Cox proportional hazard model to analyze student retention in the MOOC Principles of Microeconomics. Using two different measures of retention, video watching and quiz submission, we show that students' commitment to the course can be strongly predicted by their participation in the first week's activities. Data collected through a voluntary opt-in survey allow us to study retention in relation to demographics for a subset of enrollees. We find a higher dropout rate among college students and younger participants. Female attrition is larger when measured by video watching but not when measured by quiz completion. Self-ascribed motivation for taking the course is not a predictor of completion. We conclude that raw completion rates cannot be the criterion used to judge the success of MOOCs, as it is in the case of traditional courses. The results are consistent with the existing literature, which separates MOOC students into two different groups: Committed Learners and Browsers. Download Paper


14030 
Thanh Nguyen Ahmad Peivandi Rakesh Vohra 
"One-Sided Matching with Limited Complementarities"  
The problem of allocating bundles of indivisible objects without transfers arises in the assignment of courses to students, of computing resources like CPU time, memory, and disk space to computing tasks, and of truckloads of food to food banks. In these settings the complementarities in preferences are small compared with the size of the market. We exploit this to design mechanisms satisfying efficiency, envy-freeness, and asymptotic strategy-proofness.
Informally, we assume that agents do not want bundles that are too large: there is a parameter k such that the marginal utility of any item relative to a bundle of size k or larger is zero. We call such preferences k-demand preferences. Given this parameter, we show how to represent probability shares over bundles as lotteries over approximately feasible (deterministic) integer allocations. The degree of infeasibility in these integer allocations is controlled by the parameter k. In particular, ex post, no good is over-allocated by more than k − 1 units. Download Paper
