Working Papers
By Year:
Paper #  Author  Title  

13008 
Olivier Compte Andrew Postlewaite 
“Plausible Cooperation”, Third Version  
There is a large repeated games literature illustrating how future interactions provide incentives for cooperation. Much of the earlier literature assumes public monitoring: players always observe precisely the same thing. Departures from public monitoring to private monitoring that incorporate differences in players’ observations may dramatically complicate coordination and the provision of incentives, with the consequence that equilibria with private monitoring often seem unrealistically complex. We set out a model in which players accomplish cooperation in an intuitively plausible fashion. Players process information via a mental system — a set of psychological states and a transition function between states depending on observations. Players restrict attention to a relatively small set of simple strategies, and consequently might learn which ones perform well. Download Paper


13007 
Itzhak Gilboa Andrew Postlewaite Larry Samuelson David Schmeidler 
"Economic Models as Analogies", Third Version  
People often wonder why economists analyze models whose assumptions are known to be false, while economists feel that they learn a great deal from such exercises. We suggest that part of the knowledge generated by academic economists is case-based rather than rule-based. That is, instead of offering general rules or theories that should be contrasted with data, economists often analyze models that are "theoretical cases", which help understand economic problems by drawing analogies between the model and the problem. According to this view, economic models, empirical data, experimental results and other sources of knowledge are all on equal footing, that is, they all provide cases to which a given problem can be compared. We offer complexity arguments that explain why case-based reasoning may sometimes be the method of choice and why economists prefer simple cases. Download Paper


13006 
Antonio M. Merlo François Ortalo-Magné John Rust 
"The Home Selling Problem: Theory and Evidence"  
This paper formulates and solves the problem of a homeowner who wants to sell her house for the maximum possible price net of transactions costs (including real estate commissions). The optimal selling strategy consists of an initial list price with subsequent weekly decisions on how much to adjust the list price until the home is sold or withdrawn from the market. The solution also yields a sequence of reservation prices that determine whether the homeowner should accept offers from potential buyers who arrive stochastically over time with an expected arrival rate that is a decreasing function of the list price. We estimate the model using a rich data set of complete transaction histories for 780 residential properties in England introduced by Merlo and Ortalo-Magné (2004). For each home in the sample, the data include all listing price changes and all offers made on the home between initial listing and the final sale agreement. The estimated model fits observed list price dynamics and other key features of the data well. In particular, we show that a very small “menu cost” of changing the listing price (estimated to equal 10 thousandths of 1% of the house value, or approximately £10 for a home worth £100,000) is sufficient to explain the high degree of “stickiness” of listing prices observed in the data. Download Paper
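As a quick arithmetic check on the quoted magnitude (my own back-of-the-envelope sketch, not code from the paper):

```python
# "10 thousandths of 1%" = (10 / 1000) * 1% = 0.01% of the house value
house_value = 100_000                 # GBP, the example home from the abstract
menu_cost_rate = (10 / 1000) * 0.01   # = 0.0001
menu_cost = menu_cost_rate * house_value
print(round(menu_cost, 2))            # 10.0, i.e. roughly GBP 10 as stated
```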


13005 
Richard P. McLean Andrew Postlewaite 
"Implementation with Interdependent Valuations", Second Version  
It is well-known that the ability of the Vickrey-Clarke-Groves (VCG) mechanism to implement efficient outcomes for private value choice problems does not extend to interdependent value problems. When an agent’s type affects other agents’ utilities, it may not be incentive compatible for him to truthfully reveal his type when faced with VCG payments. We show that when agents are informationally small, there exist small modifications to VCG that restore incentive compatibility. We further show that truthful revelation is an approximate ex post equilibrium. Lastly, we show that in replicated settings aggregate payments sufficient to induce truthful revelation go to zero. Download Paper


13004 
Jere R. Behrman Susan W. Parker Petra E. Todd Kenneth I. Wolpin 
"Aligning Learning Incentives of Students and Teachers: Results from a Social Experiment in Mexican High Schools"  
This paper evaluates the impact of three different performance incentive schemes using data from a social experiment that randomized 88 Mexican high schools with over 40,000 students into three treatment groups and a control group. Treatment one provides individual incentives for performance on curriculum-based mathematics tests to students only, treatment two to teachers only, and treatment three gives both individual and group incentives to students, teachers and school administrators. Program impact estimates reveal the largest average effects for treatment three, smaller impacts for treatment one, and no impact for treatment two. Download Paper


13003 
Francis X. Diebold 
“A Personal Perspective on the Origin(s) and Development of ‘Big Data’: The Phenomenon, the Term, and the Discipline”, Second Version  
I investigate Big Data, the phenomenon, the term, and the discipline, with emphasis on origins of the term, in industry and academics, in computer science and statistics/econometrics. Big Data the phenomenon continues unabated, Big Data the term is now firmly entrenched, and Big Data the discipline is emerging. Download Paper


13002 
Naoki Aizawa Hanming Fang 
"Equilibrium Labor Market Search and Health Insurance Reform"  
We present and empirically implement an equilibrium labor market search model where risk averse workers facing medical expenditure shocks are matched with firms making health insurance coverage decisions. Our model delivers a rich set of predictions that can account for a wide variety of phenomena observed in the data, including the correlations among firm sizes, wages, health insurance offering rates, turnover rates and workers' health compositions. We estimate our model by Generalized Method of Moments using a combination of micro data sources including the Survey of Income and Program Participation (SIPP), the Medical Expenditure Panel Survey (MEPS) and the Robert Wood Johnson Foundation Employer Health Insurance Survey. We use our estimated model to evaluate the equilibrium impact of the 2010 Affordable Care Act (ACA) and find that it would reduce the uninsured rate among the workers in our estimation sample from 20.12% to 7.27%. We also examine a variety of alternative policies to understand the roles of different components of the ACA in contributing to these equilibrium changes. Interestingly, we find that the uninsured rate will be even lower (at 6.44%) if the employer mandate in the ACA is eliminated. Download Paper


13001 
David Dillenberger Andrew Postlewaite Kareen Rozen 
“Optimism and Pessimism with Expected Utility”, Third Version  
Savage (1954) provides axioms on preferences over acts that are equivalent to the existence of a subjective expected utility representation. We show that there is a continuum of other “expected utility” representations in which for any act, the probability distribution over states depends on the corresponding outcomes and is first-order stochastically dominated by (dominates) the Savage distribution. We suggest that pessimism (optimism) can be captured by the stake-dependent probabilities in these alternate representations. We then extend the DM's preferences to be defined over both subjective acts and objective lotteries. Our result permits modeling ambiguity aversion in Ellsberg's two-urn experiment using pessimistic probability assessments, the same utility over prizes for lotteries and acts, and without relaxing Savage's axioms. An implication of our results is that the large body of existing research based on expected utility can, with a simple reinterpretation, be understood as modeling the behavior of optimistic or pessimistic decision makers. Download Paper


12048 
Guido Menzio 
"Shopping Externalities and SelfFulfilling Unemployment Fluctuations"  
We propose a novel theory of selffulfilling fluctuations in the labor market. A firm employing an additional worker generates positive externalities on other firms, because employed workers have more income to spend and have less time to shop for low prices than unemployed workers. We quantify these shopping externalities and show that they are sufficiently strong to create strategic complementarities in the employment decisions of different firms and to generate multiple rational expectations equilibria. Equilibria differ with respect to the agents’ (rational) expectations about future unemployment. We show that negative shocks to agents’ expectations lead to fluctuations in vacancies, unemployment, labor productivity and the stock market that closely resemble those observed in the US during the Great Recession. Download Paper


12047 
Harold L. Cole Soojin Kim Dirk Krueger 
"Analyzing the Effects of Insuring Health Risks: On the Tradeoff between Short Run Insurance Benefits vs. Long Run Incentive Costs"  
This paper constructs a dynamic model of health insurance to evaluate the short and long run effects of policies that prevent firms from conditioning wages on the health conditions of their workers, and that prevent health insurance companies from charging individuals with adverse health conditions higher insurance premia. Our study is motivated by recent US legislation that has tightened regulations on wage discrimination against workers with poorer health status (Americans with Disabilities Act of 1990, ADA, and ADA Amendments Act of 2008, ADAAA) and that will prohibit health insurance companies from charging different premiums for workers of different health status starting in 2014 (Patient Protection and Affordable Care Act, PPACA). In the model, a tradeoff arises between the static gains from better insurance against poor health induced by these policies and their adverse dynamic incentive effects on household efforts to lead a healthy life. Using household panel data from the PSID we estimate and calibrate the model and then use it to evaluate the static and dynamic consequences of no-wage-discrimination and no-prior-conditions laws for the evolution of the cross-sectional health and consumption distribution of a cohort of households, as well as ex ante lifetime utility of a typical member of this cohort. In our quantitative analysis we find that although a combination of both policies is effective in providing full consumption insurance period by period, it is suboptimal to introduce both policies jointly since such a policy innovation induces a more rapid deterioration of the cohort health distribution over time. This is because the combination of both laws severely undermines the incentives to lead healthier lives. The resulting negative effects on health outcomes in society more than offset the static gains from better consumption insurance, so that expected discounted lifetime utility is lower under both policies, relative to only implementing wage non-discrimination legislation. Download Paper


12046 
Xu Cheng Bruce E. Hansen 
"Forecasting with FactorAugmented Regression: A Frequentist Model Averaging Approach"  
This paper considers forecast combination with factoraugmented regression. In this framework, a large number of forecasting models are available, varying by the choice of factors and the number of lags. We investigate forecast combination using weights that minimize the Mallows and the leavehout cross validation criteria. The unobserved factor regressors are estimated by principle components of a large panel with N predictors over T periods. With these generated regressors, we show that the Mallows and leavehout cross validation criteria are approximately unbiased estimators of the onestepahead and multistepahead mean squared forecast errors, respectively, provided that N, T —› ∞. In contrast to wellknown results in the literature, the generatedregressor issue can be ignored for forecast combination, without restrictions on the relation between N and T. Simulations show that the Mallows model averaging and leavehout crossvalidation averaging methods yield lower mean squared forecast errors than alternative model selection and averaging methods such as AIC, BIC, cross validation, and Bayesian model averaging. We apply the proposed methods to the U.S. macroeconomic data set in Stock and Watson (2012) and find that they compare favorably to many popular shrinkagetype forecasting methods. Download Paper
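To fix ideas, the Mallows combination criterion can be illustrated with a minimal two-model sketch (my own illustration, not the authors' code; the function name, the grid search, and the assumption of a known error variance `sigma2` are all simplifications of the paper's setup):

```python
def mallows_weight(y, fit_a, fit_b, k_a, k_b, sigma2, grid=1001):
    """Grid-search the Mallows criterion C(w) = SSE(w) + 2*sigma2*k(w)
    over combined fitted values w*fit_a + (1 - w)*fit_b, where
    k(w) = w*k_a + (1 - w)*k_b penalizes effective model size."""
    best_c, best_w = float("inf"), 0.0
    for i in range(grid):
        w = i / (grid - 1)
        sse = sum((yi - (w * fa + (1 - w) * fb)) ** 2
                  for yi, fa, fb in zip(y, fit_a, fit_b))
        c = sse + 2.0 * sigma2 * (w * k_a + (1 - w) * k_b)
        if c < best_c:
            best_c, best_w = c, w
    return best_w
```

With `sigma2 = 0` the criterion reduces to in-sample fit and puts all weight on the better-fitting model; the penalty term shifts weight toward the more parsimonious model, which is what makes the criterion a proxy for out-of-sample risk rather than in-sample fit.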


12045 
Xu Cheng Zhipeng Liao 
"Select the Valid and Relevant Moments: A OneStep Procedure for GMM with Many Moments"  
This paper considers the selection of valid and relevant moments for the generalized method of moments (GMM) estimation. For applications with many candidate moments, our asymptotic analysis ccommodates a diverging number of moments as the sample size increases. The proposed procedure achieves three objectives in onestep: (i) the valid and relevant moments are selected simultaneously rather than sequentially; (ii) all desired moments are selected together instead of in a stepwise manner; (iii) the parameter of interest is automatically estimated with all selected moments as opposed to a postselection estimation. The new moment selection method is achieved via an informationbased adaptive GMM shrinkage estimation, where an appropriate penalty is attached to the standard GMM criterion to link moment selection to shrinkage estimation. The penalty is designed to signal both moment validity and relevance for consistent moment selection and efficient estimation. The asymptotic analysis allows for nonsmooth sample moments and weakly dependent observations, making it generally applicable.
For practical implementation, this onestep procedure is computationally attractive. Download Paper


12044 
Yuichi Yamamoto 
"Individual Learning and Cooperation in Noisy Repeated Games"  
We investigate whether two players in a long-run relationship can maintain cooperation when the details of the underlying game are unknown. Specifically, we consider a new class of repeated games with private monitoring, where an unobservable state of the world influences the payoff functions and/or the monitoring structure. Each player privately learns the state over time, but cannot observe what the opponent learns. We show that there are robust equilibria where players eventually obtain payoffs as if the true state were common knowledge and players played a “belief-free” equilibrium. The result is applied to various examples, including secret price-cutting with unknown demand. Download Paper


12043 
V. Bhaskar George J. Mailath Stephen Morris 
"A Foundation for Markov Equilibria in Infinite Horizon Perfect Information Games"  
We study perfect information games with an infinite horizon played by an arbitrary number of players. This class of games includes infinitely repeated perfect information games, repeated games with asynchronous moves, games with long and short run players, games with overlapping generations of players, and canonical noncooperative models of bargaining. We consider two restrictions on equilibria. An equilibrium is purifiable if nearby behavior is consistent with equilibrium when agents' payoffs at each node are perturbed additively and independently. An equilibrium has bounded recall if there exists K such that at most one player's strategy depends on what happened more than K periods earlier. We show that only Markov equilibria have bounded recall and are purifiable. Thus if a game has at most one long-run player, all purifiable equilibria are Markov. Download Paper


12042 
Qingmin Liu George J. Mailath Andrew Postlewaite Larry Samuelson 
"Stable Matching with Incomplete Information", Second Version  
We formulate a notion of stable outcomes in matching problems with one-sided asymmetric information. The key conceptual problem is to formulate a notion of a blocking pair that takes account of the inferences that the uninformed agent might make from the hypothesis that the current allocation is stable. We show that the set of stable outcomes is nonempty in incomplete information environments, and is a superset of the set of complete-information stable outcomes. We then provide sufficient conditions for incomplete-information stable matchings to be efficient. Lastly, we define a notion of price sustainable allocations and show that the set of incomplete information stable matchings is a subset of the set of such allocations. Download Paper


12041 
Alojzy Z. Nowak 
"Failing Institutions Are at the Core of the Euro Crisis"  
The European Union was created to promote economic, cultural, and regional prosperity. However, the Global Financial Crisis demonstrates that its economic institutions are flawed. While each sovereign state in the Eurozone forfeits control of its money supply, the lack of a common fiscal institution allows individual countries to pursue their own political and financial agendas. The ongoing economic hardship emphasizes the critical role of economic and political institutions. This paper analyzes both beneficial and perverse incentives of joining the European Union, discusses the consequences of deficient economic institutions, and provides potential solutions towards the alleviation of the crisis. Download Paper


12040 

"Failing Institutions Are at the Core of the U.S. Financial Crisis"  
This paper uses the structure of institutional economics to provide an explanation of the recent U.S. financial crisis. Institutional theory suggests that a country’s political, legal, social, and cultural institutions determine and characterize its economy. An institutional perspective on financial crises therefore incorporates unquantifiable aspects of the real world. Different institutions interacted to ignite and fuel the global crisis. A thorough understanding of all of the legal, political, and cultural institutions that encompass a society, as well as their role in the market, is needed to explain and avoid the recurrence of financial crises. Download Paper


12039 
Asma Hyder Jere R. Behrman Hans-Peter Kohler 
"Negative Economic Shocks and Child Schooling: Evidence from Rural Malawi"  
This study investigates the impacts of negative economic shocks on child schooling in households of rural Malawi, one of the poorest countries in Sub-Saharan Africa (SSA). Two waves of household panel data for the years 2006 and 2008 from the Malawi Longitudinal Study of Families and Health (MLSFH) are used to examine the impact of negative shocks on child schooling. Both individually-reported and community-level shocks are investigated. A priori, the impact of negative shocks on schooling may be negative (if income effects dominate) or positive (if price effects dominate). Also, the effects may be larger for measures of idiosyncratic shocks (if there is considerable within-community variation in experiencing shocks) or for aggregate shocks (if community support networks buffer idiosyncratic shocks better than aggregate shocks). Finally, there may be gender differences in the relevance for child schooling of shocks reported by men versus those reported by women, with, for example, the former having larger effects if resource constraints have strong effects on schooling and if, because of gender roles, men perceive shocks that affect household resources better than women do. The study finds that negative economic shocks have significant negative impacts on child school enrollment and grade attainment, with the estimated effects of community shocks larger and more pervasive than those of idiosyncratic shocks, and with the estimated effects of shocks reported by men as large as or larger than those of shocks reported by women. Download Paper


12038 
Hyunjoon Park Jere R. Behrman Jaesung Choi 
"Do SingleSex Schools Enhance Students’ STEM (Science, Technology, Engineering, and Mathematics) Outcomes?"  
Despite women’s significant improvement in educational attainment, underrepresentation of women in Science, Technology, Engineering, and Mathematics (STEM) college majors persists in most countries. We address whether one particular institution – singlesex schools – may enhance female – or male – students’ STEM careers. Exploiting the unique setting in Korea where assignment to allgirls, allboys or coeducational high schools is random, we move beyond associations to assess causal effects of singlesex schools. We use administrative data on national college entrance mathematics examination scores and a longitudinal survey of high school seniors that provide various STEM outcomes (mathematics and science interest and selfefficacy, expectations of a fouryear college attendance and a STEM college major during the high school senior year, and actual attendance at a fouryear college and choice of a STEM major two years after high school). We find significantly positive effects of allboys schools consistently across different STEM outcomes, whereas the positive effect of allgirls schools is only found for mathematics scores. Download Paper


12037 
Francis X. Diebold 
"On the Origin(s) and Development of the Term “Big Data"  
I investigate the origins of the nowubiquitous term ”Big Data," in industry and academics, in computer science and statistics/econometrics. Credit for coining the term must be shared. In particular, John Mashey and others at Silicon Graphics produced highly relevant (unpublished, nonacademic) work in the mid1990s. The first significant academic references (independent of each other and of Silicon Graphics) appear to be Weiss and Indurkhya (1998) in computer science and Diebold (2000) in statistics /econometrics. Douglas Laney of Gartner also produced insightful work (again unpublished and nonacademic) slightly later. Big Data the term is now firmly entrenched, Big Data the phenomenon continues unabated, and Big Data the discipline is emerging. Download Paper


12036 
David Dillenberger Philipp Sadowski 
"Generalized Partition and Subjective Filtration"  
We study an individual who faces a dynamic decision problem in which the process of information arrival is unobserved by the analyst, and hence should be identified from observed choice data. An information structure is objectively describable if signals correspond to events of the objective state space. We derive a representation of preferences over menus of acts that captures the behavior of a Bayesian decision maker who expects to receive such signals. The class of information structures that can support such a representation generalizes the notion of a partition of the state space. The representation allows us to compare individuals in terms of the preciseness of their information structures without requiring that they share the same prior beliefs. We apply the model to study an individual who anticipates gradual resolution of uncertainty over time. Both the filtration (the timing of information arrival with the sequence of partitions it induces) and prior beliefs are uniquely identified. Download Paper


12035 
Francis X. Diebold 
"Comparing Predictive Accuracy, Twenty Years Later: A Personal Perspective on the Use and Abuse of DieboldMariano Tests"  
The DieboldMariano (DM) test was intended for comparing forecasts; it has been, and remains, useful in that regard. The DM test was not intended for comparing models. Unfortunately, however, much of the large subsequent literature uses DMtype tests for comparing models, in (pseudo) outofsample environments. In that case, much simpler yet more compelling fullsample model comparison procedures exist; they have been, and should continue to be, widely used. The hunch that (pseudo) outofsample analysis is somehow the “only," or “best," or even a “good" way to provide insurance against insample over fitting in model comparisons proves largely false. On the other hand, (pseudo) outofsample analysis may be useful for learning about comparative historical predictive performance. Download Paper
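For readers who have not seen it, the DM statistic itself is simple. A minimal sketch for one-step-ahead forecasts under squared-error loss (my own illustration, not Diebold's code; it omits the HAC long-run variance correction needed for multi-step forecasts):

```python
import math
import random
import statistics

def dm_statistic(errors_1, errors_2):
    """Diebold-Mariano statistic for equal predictive accuracy:
    mean loss differential scaled by its estimated standard error.
    Squared-error loss, one-step-ahead case (no HAC correction)."""
    d = [a * a - b * b for a, b in zip(errors_1, errors_2)]  # loss differential
    t = len(d)
    d_bar = statistics.mean(d)
    s2 = statistics.variance(d)  # sample variance of the differential
    return d_bar / math.sqrt(s2 / t)

# Toy usage: forecast 1 has systematically larger errors than forecast 2,
# so the statistic should be positive.
random.seed(0)
e2 = [random.gauss(0, 1) for _ in range(200)]
e1 = [e + 0.5 for e in e2]
print(dm_statistic(e1, e2) > 0)  # True
```

Compared against a standard normal, a large |DM| rejects equal forecast accuracy — the forecast-comparison use the abstract endorses, as opposed to model comparison.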


12034 
David Dillenberger Philipp Sadowski Juan Sebastian Lleras Norio Takeoka 
"A Theory of Subjective Learning"  
We study an individual who faces a dynamic decision problem in which the process of information arrival is unobserved by the analyst. We derive two utility representations of preferences over menus of acts that capture the individual’s uncertainty about his future beliefs. The most general representation identifies a unique probability distribution over the set of posteriors that the decision maker might face at the time of choosing from the menu. We use this representation to characterize a notion of “more preference for flexibility” via a subjective analogue of Blackwell’s (1951, 1953) comparisons of experiments. A more specialized representation uniquely identifies information as a partition of the state space. This result allows us to compare individuals who expect to learn differently, even if they do not agree on their prior beliefs. We conclude by extending the basic model to accommodate an individual who expects to learn gradually over time by means of a subjective filtration. Download Paper


12033 
Yena Park 
"Optimal Taxation in a Limited Commitment Economy"  
This paper studies optimal Ramsey taxation when risk sharing in private insurance markets is imperfect due to limited enforcement. In a limited commitment economy, there are externalities associated with capital and labor because individuals do not take into account that their labor and saving decisions affect aggregate supply, wages, and thus the value of autarky. Due to these externalities, the Ramsey government has an additional goal, which is to internalize the externalities of labor and capital to improve risk sharing, in addition to its usual goal of minimizing distortions when financing government expenditures. These two goals drive capital and labor taxes in opposite directions. By balancing these conflicting goals, the steady-state optimal capital income taxes are levied only to remove the negative externality of capital, and optimal labor income taxes are set to meet the budgetary needs of the government in the long run, despite positive externalities of labor. Download Paper


12032 
Qingmin Liu George J. Mailath Andrew Postlewaite Larry Samuelson 
"Matching with Incomplete Information"  
A large literature uses matching models to analyze markets with two-sided heterogeneity, studying problems such as the matching of students to schools, residents to hospitals, husbands to wives, and workers to firms. The analysis typically assumes that the agents have complete information, and examines core outcomes. We formulate a notion of stable outcomes in matching problems with one-sided asymmetric information. The key conceptual problem is to formulate a notion of a blocking pair that takes account of the inferences that the uninformed agent might make from the hypothesis that the current allocation is stable. We show that the set of stable outcomes is nonempty in incomplete information environments, and is a superset of the set of complete-information stable outcomes. We provide sufficient conditions for incomplete-information stable matchings to be efficient. Download Paper


12031 
David Dillenberger Andrew Postlewaite Kareen Rozen 
“Optimism and Pessimism with Expected Utility”, Second Version  
Savage (1954) provided axioms on preferences over acts that were equivalent to the existence of an expected utility representation. We show that there is a continuum of other “expected utility” representations in which for any act, the probability distribution over states depends on the corresponding outcomes. We suggest that optimism and pessimism can be captured by the stake-dependent probabilities in these alternative representations. Extending the DM's preferences to be defined on both subjective acts and objective lotteries, we suggest how one may distinguish optimists from pessimists and separate attitude towards uncertainty from curvature of the utility function over monetary prizes. Download Paper


12030 
Itzhak Gilboa Andrew Postlewaite Larry Samuelson David Schmeidler 
"Economic Models as Analogies", Second Version  
People often wonder why economists analyze models whose assumptions are known to be false, while economists feel that they learn a great deal from such exercises. We suggest that part of the knowledge generated by academic economists is case-based rather than rule-based. That is, instead of offering general rules or theories that should be contrasted with data, economists often analyze models that are “theoretical cases”, which help understand economic problems by drawing analogies between the model and the problem. According to this view, economic models, empirical data, experimental results and other sources of knowledge are all on equal footing, that is, they all provide cases to which a given problem can be compared. We offer complexity arguments that explain why case-based reasoning may sometimes be the method of choice; why economists prefer simple examples; and why a paradigm may be useful even if it does not produce theories. Download Paper


12029 
David Dillenberger Kareen Rozen 
"HistoryDependent Risk Attitude" Second Version  
We propose a model of historydependent risk attitude, allowing a decision maker’s risk attitude to be affected by his history of disappointments and elations. The decision maker recursively evaluates compound risks, classifying realizations as disappointing or elating using a threshold rule. We establish equivalence between the model and two cognitive biases: risk attitudes are reinforced by experiences (one is more risk averse after disappointment than after elation) and there is a primacy effect (early outcomes have the greatest impact on risk attitude). In dynamic asset pricing, the model yields volatile, pathdependent prices. Download Paper


12028 
Shamena Anwar Hanming Fang 
"Testing for Racial Prejudice in the Parole Board Release Process: Theory and Evidence"  
We develop a model of a Parole Board contemplating whether to grant parole release to a prisoner who has finished serving their minimum sentence. The model implies a simple outcome test for racial prejudice robust to the inframarginality problem. Our test involves running simple regressions of whether a prisoner recidivates on the exposure time to the risk of recidivism and its square, using only the sample of prisoners who are granted parole release strictly between their minimum and maximum sentences and separately by race. If the coefficient estimates on the exposure time term differ by race, then there is evidence of racial prejudice against the racial group with the smaller coefficient estimate. We implement our test for prejudice using data from Pennsylvania from January 1996 to December 31, 2001. Although we find racial differences in time served, we find no evidence for racial prejudice on the part of the Parole Board based on our outcome test. Download Paper
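The regression step of the outcome test can be sketched as follows (an illustrative reconstruction with simulated data, not the authors' code or data; the tiny OLS solver and all variable names are mine):

```python
def ols(X, y):
    """Least squares via the normal equations (X'X) b = X'y,
    solved by Gaussian elimination with partial pivoting."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)]
         + [sum(X[i][p] * y[i] for i in range(n))]
         for p in range(k)]
    for p in range(k):
        piv = max(range(p, k), key=lambda r: abs(A[r][p]))
        A[p], A[piv] = A[piv], A[p]
        for r in range(p + 1, k):
            f = A[r][p] / A[p][p]
            for c in range(p, k + 1):
                A[r][c] -= f * A[p][c]
    b = [0.0] * k
    for p in range(k - 1, -1, -1):
        b[p] = (A[p][k] - sum(A[p][c] * b[c] for c in range(p + 1, k))) / A[p][p]
    return b

def exposure_coefficient(exposure, recidivism):
    """Coefficient on exposure time in: recidivism ~ 1 + exposure + exposure^2,
    run separately by race in the paper's test."""
    X = [[1.0, t, t * t] for t in exposure]
    return ols(X, recidivism)[1]

# Hypothetical groups: group A's recidivism risk rises faster with exposure time.
times = [0.1 * i for i in range(1, 21)]   # exposure times (years)
risk_a = [0.30 * t for t in times]
risk_b = [0.10 * t for t in times]
print(exposure_coefficient(times, risk_a) > exposure_coefficient(times, risk_b))  # True
```

In the paper's test, a difference like this between the race-specific exposure-time coefficients, estimated on parolees released strictly between their minimum and maximum sentences, would be evidence of prejudice against the group with the smaller coefficient.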


12027 
Olivier Compte Andrew Postlewaite 
"Belief Formation", Second Version  
Consider an agent who is unsure of the state of the world and faces computational bounds on mental processing. The agent receives a sequence of signals imperfectly correlated with the true state that he will use to take a single decision. The agent is assumed to have a finite number of "states of mind" that quantify his beliefs about the relative likelihood of the states, and uses the signals he receives to move from one state to another. At a random stopping time, the agent will be called upon to make a decision based solely on his mental state at that time. We show that under quite general conditions it is optimal that the agent ignore signals that are not very informative, that is, signals for which the likelihood of the states is nearly equal. This model provides a possible explanation of systematic inference mistakes people may make. Download Paper


12026 

"Pricing and Incentives in Publicly Subsidized Health Care Markets: The Case of Medicare Part D"  
In Medicare Part D, low income individuals receive subsidies to enroll in insurance plans. This paper studies how premiums are distorted by the combined effects of this subsidy and the default assignment of low income enrollees into plans. Removing this distortion could reduce the cost of the program without worsening consumers' welfare. Using data from the first five years of the program, an econometric model is used to estimate consumers' demand for plans and to compute what premiums would be without the subsidy distortion. Preliminary estimates suggest that the reduction in premiums of affected plans would be substantial. Download Paper


12025 
Can Tian 
"Riskiness Choice and Endogenous Productivity Dispersion over the Business Cycle"  
Cross-sectional productivity dispersion is countercyclical, at the plant level and at the firm level. I incorporate a firm’s project choice decision into a firm dynamics model with business cycle features to explain this empirical finding both qualitatively and quantitatively. In particular, all projects available have the same expected flow return and differ from one another only in the riskiness level. The endogenous option of exiting the market and limited funding for new investment jointly play an important role in motivating firms’ risk-taking behavior. The model predicts that relatively small firms are more likely to take risk and that the cross-sectional productivity dispersion, measured as the variance/standard deviation of firm-level profitability, is larger in recessions. Download Paper


12024 
Fei Li Can Tian 
“Directed Search and Job Rotation”  
We consider the impact of job rotation in a directed search model in which firm sizes are endogenously determined and match quality is initially unknown. A large firm benefits from the opportunity of rotating workers so as to partially overcome the loss from mismatch. As a result, in the unique symmetric equilibrium, large firms have higher labor productivity and lower separation rates. In contrast to the standard directed search model with multi-vacancy firms, this model can generate a positive correlation between firm size and wage without introducing any ex ante productivity differences or imposing any non-concave production function assumption. Download Paper


12023 
Francesc Dilme Fei Li 
"Dynamic Education Signaling with Dropout"  
We present a dynamic signaling model where wasteful education takes place over several periods of time. Workers pay an education cost per unit of time and cannot commit to a fixed education length. Workers face an exogenous dropout risk before graduation. Since low-productivity workers' cost is high, pooling with early dropouts helps them to avoid a high education cost. In equilibrium, low-productivity workers choose to endogenously drop out over time, so the productivity of workers in college increases along the education process. We find that (1) wasteful education signals exist even when job offers are privately made and the length of the period is small, (2) the maximum education length is decreasing in the prior probability that a worker is highly productive, and (3) the joint dynamics of returns to education and the dropout rate are characterized, which is consistent with previous empirical evidence. Download Paper


12022 
Mikhail Golosov Pricila Maziero Guido Menzio 
"Taxation and Redistribution of Residual Income Inequality"  
This paper studies the optimal redistribution of income inequality caused by the presence of search and matching frictions in the labor market. We study this problem in the context of a directed search model of the labor market populated by homogeneous workers and heterogeneous firms. The optimal redistribution can be attained using a positive unemployment benefit and an increasing and regressive labor income tax. The positive unemployment benefit serves the purpose of lowering the search risk faced by workers. The increasing and regressive labor tax serves the purpose of aligning the cost to the firm of attracting an additional applicant with the value of an application to society. Download Paper
