It is well-known that the ability of the Vickrey-Clarke-Groves (VCG) mechanism to implement efficient outcomes for private value choice problems does not extend to interdependent value problems. When an agent’s type affects other agents’ utilities, it may not be incentive compatible for him to truthfully reveal his type when faced with VCG payments. We show that when agents are informationally small, there exist small modifications to VCG that restore incentive compatibility. We further show that truthful revelation is an approximate ex post equilibrium. Lastly, we show that in replicated settings aggregate payments sufficient to induce truthful revelation go to zero. Download Paper
This paper evaluates the impact of three different performance incentive schemes using data from a social experiment that randomized 88 Mexican high schools with over 40,000 students into three treatment groups and a control group. Treatment one provides individual incentives for performance on curriculum-based mathematics tests to students only, treatment two provides them to teachers only, and treatment three gives both individual and group incentives to students, teachers and school administrators. Program impact estimates reveal the largest average effects for treatment three, smaller impacts for treatment one and no impact for treatment two. Download Paper
I investigate Big Data, the phenomenon, the term, and the discipline, with emphasis on origins of the term, in industry and academics, in computer science and statistics/econometrics. Big Data the phenomenon continues unabated, Big Data the term is now firmly entrenched, and Big Data the discipline is emerging. Download Paper
We present and empirically implement an equilibrium labor market search model where risk averse workers facing medical expenditure shocks are matched with firms making health insurance coverage decisions. Our model delivers a rich set of predictions that can account for a wide variety of phenomena observed in the data, including the correlations among firm sizes, wages, health insurance offering rates, turnover rates and workers' health compositions. We estimate our model by Generalized Method of Moments using a combination of micro data sources including the Survey of Income and Program Participation (SIPP), the Medical Expenditure Panel Survey (MEPS) and the Robert Wood Johnson Foundation Employer Health Insurance Survey. We use our estimated model to evaluate the equilibrium impact of the 2010 Affordable Care Act (ACA) and find that it would reduce the uninsured rate among the workers in our estimation sample from 20.12% to 7.27%. We also examine a variety of alternative policies to understand the roles of different components of the ACA in contributing to these equilibrium changes. Interestingly, we find that the uninsured rate will be even lower (at 6.44%) if the employer mandate in the ACA is eliminated. Download Paper
Savage (1954) provides axioms on preferences over acts that are equivalent to the existence of a subjective expected utility representation. We show that there is a continuum of other “expected utility” representations in which for any act, the probability distribution over states depends on the corresponding outcomes and is first-order stochastically dominated by (dominates) the Savage distribution. We suggest that pessimism (optimism) can be captured by the stake-dependent probabilities in these alternate representations. We then extend the DM's preferences to be defined over both subjective acts and objective lotteries. Our result permits modeling ambiguity aversion in Ellsberg's two-urn experiment using pessimistic probability assessments, the same utility over prizes for lotteries and acts, and without relaxing Savage's axioms. An implication of our results is that the large body of existing research based on expected utility can, with a simple reinterpretation, be understood as modeling the behavior of optimistic or pessimistic decision makers. Download Paper
We propose a novel theory of self-fulfilling fluctuations in the labor market. A firm employing an additional worker generates positive externalities on other firms, because employed workers have more income to spend and have less time to shop for low prices than unemployed workers. We quantify these shopping externalities and show that they are sufficiently strong to create strategic complementarities in the employment decisions of different firms and to generate multiple rational expectations equilibria. Equilibria differ with respect to the agents’ (rational) expectations about future unemployment. We show that negative shocks to agents’ expectations lead to fluctuations in vacancies, unemployment, labor productivity and the stock market that closely resemble those observed in the US during the Great Recession. Download Paper
This paper constructs a dynamic model of health insurance to evaluate the short- and long-run effects of policies that prevent firms from conditioning wages on the health conditions of their workers, and that prevent health insurance companies from charging individuals with adverse health conditions higher insurance premia. Our study is motivated by recent US legislation that has tightened regulations on wage discrimination against workers with poorer health status (the Americans with Disabilities Act, ADA, and the ADA Amendments Act of 2008, ADAAA) and that will prohibit health insurance companies from charging different premiums for workers of different health status starting in 2014 (the Patient Protection and Affordable Care Act, PPACA). In the model, a trade-off arises between the static gains from better insurance against poor health induced by these policies and their adverse dynamic incentive effects on household efforts to lead a healthy life. Using household panel data from the PSID, we estimate and calibrate the model and then use it to evaluate the static and dynamic consequences of no-wage-discrimination and no-prior-conditions laws for the evolution of the cross-sectional health and consumption distribution of a cohort of households, as well as for the ex-ante lifetime utility of a typical member of this cohort. In our quantitative analysis we find that although a combination of both policies is effective in providing full consumption insurance period by period, it is suboptimal to introduce both policies jointly, since doing so induces a more rapid deterioration of the cohort health distribution over time. This is because the combination of both laws severely undermines the incentives to lead healthier lives. The resulting negative effects on health outcomes in society more than offset the static gains from better consumption insurance, so that expected discounted lifetime utility is lower under both policies, relative to only implementing wage nondiscrimination legislation. Download Paper
This paper considers forecast combination with factor-augmented regression. In this framework, a large number of forecasting models are available, varying by the choice of factors and the number of lags. We investigate forecast combination using weights that minimize the Mallows and the leave-h-out cross validation criteria. The unobserved factor regressors are estimated by principal components of a large panel with N predictors over T periods. With these generated regressors, we show that the Mallows and leave-h-out cross validation criteria are approximately unbiased estimators of the one-step-ahead and multi-step-ahead mean squared forecast errors, respectively, provided that N, T → ∞. In contrast to well-known results in the literature, the generated-regressor issue can be ignored for forecast combination, without restrictions on the relation between N and T. Simulations show that the Mallows model averaging and leave-h-out cross-validation averaging methods yield lower mean squared forecast errors than alternative model selection and averaging methods such as AIC, BIC, cross validation, and Bayesian model averaging. We apply the proposed methods to the U.S. macroeconomic data set in Stock and Watson (2012) and find that they compare favorably to many popular shrinkage-type forecasting methods. Download Paper
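The Mallows-weighted combination described in this abstract can be illustrated numerically. Below is a minimal sketch for two candidate models, using a grid search over the unit simplex; the function name `mallows_weights`, the grid-search implementation, and the toy inputs are illustrative assumptions, not the paper's actual procedure (which handles many models and generated factor regressors).

```python
import numpy as np

def mallows_weights(y, fits, k, sigma2, step=0.01):
    """Grid-search Mallows weights for averaging two candidate models.

    y: (T,) target; fits: list of two (T,) in-sample fitted-value vectors;
    k: list of parameter counts for each model; sigma2: error-variance estimate.
    Minimizes C(w) = ||y - sum_m w_m * fit_m||^2 + 2*sigma2*sum_m w_m*k_m
    over weights on the unit simplex (here just w and 1 - w).
    """
    y = np.asarray(y, dtype=float)
    best_w, best_c = 0.0, np.inf
    for w in np.arange(0.0, 1.0 + 1e-12, step):
        comb = w * fits[0] + (1 - w) * fits[1]
        # squared-error fit plus effective-parameter penalty
        c = np.sum((y - comb) ** 2) + 2 * sigma2 * (w * k[0] + (1 - w) * k[1])
        if c < best_c:
            best_w, best_c = w, c
    return best_w, 1 - best_w
```

In this toy setting, a model that fits perfectly and carries the smaller penalty receives all the weight; with many models one would replace the grid search with a constrained quadratic program.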
This paper considers the selection of valid and relevant moments for the generalized method of moments (GMM) estimation. For applications with many candidate moments, our asymptotic analysis accommodates a diverging number of moments as the sample size increases. The proposed procedure achieves three objectives in one step: (i) the valid and relevant moments are selected simultaneously rather than sequentially; (ii) all desired moments are selected together instead of in a stepwise manner; (iii) the parameter of interest is automatically estimated with all selected moments as opposed to a post-selection estimation. The new moment selection method is achieved via an information-based adaptive GMM shrinkage estimation, where an appropriate penalty is attached to the standard GMM criterion to link moment selection to shrinkage estimation. The penalty is designed to signal both moment validity and relevance for consistent moment selection and efficient estimation. The asymptotic analysis allows for non-smooth sample moments and weakly dependent observations, making it generally applicable. For practical implementation, this one-step procedure is computationally attractive. Download Paper
We investigate whether two players in a long-run relationship can maintain cooperation when the details of the underlying game are unknown. Specifically, we consider a new class of repeated games with private monitoring, where an unobservable state of the world influences the payoff functions and/or the monitoring structure. Each player privately learns the state over time, but cannot observe what the opponent learns. We show that there are robust equilibria where players eventually obtain payoffs as if the true state were common knowledge and players played a “belief-free” equilibrium. The result is applied to various examples, including secret price-cutting with unknown demand. Download Paper
We study perfect information games with an infinite horizon played by an arbitrary number of players. This class of games includes infinitely repeated perfect information games, repeated games with asynchronous moves, games with long and short run players, games with overlapping generations of players, and canonical non-cooperative models of bargaining. We consider two restrictions on equilibria. An equilibrium is purifiable if close-by behavior is consistent with equilibrium when agents' payoffs at each node are perturbed additively and independently. An equilibrium has bounded recall if there exists K such that at most one player's strategy depends on what happened more than K periods earlier. We show that only Markov equilibria have bounded recall and are purifiable. Thus if a game has at most one long-run player, all purifiable equilibria are Markov. Download Paper
We formulate a notion of stable outcomes in matching problems with one-sided asymmetric information. The key conceptual problem is to formulate a notion of a blocking pair that takes account of the inferences that the uninformed agent might make from the hypothesis that the current allocation is stable. We show that the set of stable outcomes is nonempty in incomplete information environments, and is a superset of the set of complete-information stable outcomes. We then provide sufficient conditions for incomplete-information stable matchings to be efficient. Lastly, we define a notion of price-sustainable allocations and show that the set of incomplete-information stable matchings is a subset of the set of such allocations. Download Paper
The European Union was created to promote economic, cultural, and regional prosperity. However, the Global Financial Crisis demonstrates that its economic institutions are flawed. While each sovereign state in the Eurozone forfeits the control of its money supply, the lack of a common fiscal institution allows individual countries to pursue their own political and financial agendas. The on-going economic hardship emphasizes the critical role of economic and political institutions. This paper analyzes both beneficial and perverse incentives of joining the European Union, discusses the consequences of deficient economic institutions and provides potential solutions towards the alleviation of the crisis. Download Paper
This paper uses the structure of institutional economics to provide an explanation of the recent U.S. financial crisis. Institutional theory suggests that a country’s political, legal, social, and cultural institutions determine and characterize its economy. An institutional perspective of financial crises therefore incorporates unquantifiable aspects of the real world. Different institutions interacted to ignite and fuel the global crisis. A thorough understanding of all of the legal, political, and cultural institutions that encompass a society, as well as their role in the market, is needed to explain and avoid the recurrence of financial crises. Download Paper
This study investigates the impacts of negative economic shocks on child schooling in households of rural Malawi, one of the poorest countries in Sub-Saharan Africa (SSA). Two waves of household panel data for the years 2006 and 2008 from the Malawi Longitudinal Study of Families and Health (MLSFH) are used to examine the impact of negative shocks on child schooling. Both individually-reported and community-level shocks are investigated. A priori, the impact of negative shocks on schooling may be negative (if income effects dominate) or positive (if price effects dominate). Also, the effects may be larger for measures of idiosyncratic shocks (if there is considerable within-community variation in experiencing shocks) or for aggregate shocks (if community support networks buffer idiosyncratic shocks better than aggregate shocks). Finally, there may be gender differences in the relevance for child schooling of shocks reported by men versus those reported by women, with, for example, the former having larger effects if resource constraints have strong effects on schooling and if, because of gender roles, men perceive shocks that affect household resources better than women do. The study finds that negative economic shocks have significant negative impacts on child school enrollment and grade attainment, with the estimated effects of community shocks larger and more pervasive than those of idiosyncratic shocks, and with the estimated effects of shocks reported by men as large as or larger than those of shocks reported by women. Download Paper
Despite women’s significant improvement in educational attainment, underrepresentation of women in Science, Technology, Engineering, and Mathematics (STEM) college majors persists in most countries. We address whether one particular institution – single-sex schools – may enhance female – or male – students’ STEM careers. Exploiting the unique setting in Korea where assignment to all-girls, all-boys or coeducational high schools is random, we move beyond associations to assess causal effects of single-sex schools. We use administrative data on national college entrance mathematics examination scores and a longitudinal survey of high school seniors that provide various STEM outcomes (mathematics and science interest and self-efficacy, expectations of four-year college attendance and of a STEM college major during the high school senior year, and actual attendance at a four-year college and choice of a STEM major two years after high school). We find significantly positive effects of all-boys schools consistently across different STEM outcomes, whereas the positive effect of all-girls schools is only found for mathematics scores. Download Paper
I investigate the origins of the now-ubiquitous term “Big Data,” in industry and academics, in computer science and statistics/econometrics. Credit for coining the term must be shared. In particular, John Mashey and others at Silicon Graphics produced highly relevant (unpublished, non-academic) work in the mid-1990s. The first significant academic references (independent of each other and of Silicon Graphics) appear to be Weiss and Indurkhya (1998) in computer science and Diebold (2000) in statistics/econometrics. Douglas Laney of Gartner also produced insightful work (again unpublished and non-academic) slightly later. Big Data the term is now firmly entrenched, Big Data the phenomenon continues unabated, and Big Data the discipline is emerging. Download Paper
We study an individual who faces a dynamic decision problem in which the process of information arrival is unobserved by the analyst, and hence should be identified from observed choice data. An information structure is objectively describable if signals correspond to events of the objective state space. We derive a representation of preferences over menus of acts that captures the behavior of a Bayesian decision maker who expects to receive such signals. The class of information structures that can support such a representation generalizes the notion of a partition of the state space. The representation allows us to compare individuals in terms of the preciseness of their information structures without requiring that they share the same prior beliefs. We apply the model to study an individual who anticipates gradual resolution of uncertainty over time. Both the filtration (the timing of information arrival with the sequence of partitions it induces) and prior beliefs are uniquely identified. Download Paper
The Diebold-Mariano (DM) test was intended for comparing forecasts; it has been, and remains, useful in that regard. The DM test was not intended for comparing models. Unfortunately, however, much of the large subsequent literature uses DM-type tests for comparing models, in (pseudo-) out-of-sample environments. In that case, much simpler yet more compelling full-sample model comparison procedures exist; they have been, and should continue to be, widely used. The hunch that (pseudo-) out-of-sample analysis is somehow the “only,” or “best,” or even a “good” way to provide insurance against in-sample overfitting in model comparisons proves largely false. On the other hand, (pseudo-) out-of-sample analysis may be useful for learning about comparative historical predictive performance. Download Paper
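For concreteness, the DM statistic for comparing two forecasts tests whether the mean loss differential is zero. Below is a minimal sketch assuming per-period losses are already computed and a rectangular-window long-run variance truncated at h-1 lags; the function name and these implementation choices are ours, not prescribed by the paper.

```python
import numpy as np

def dm_statistic(loss1, loss2, h=1):
    """Diebold-Mariano statistic for equal predictive accuracy.

    loss1, loss2: arrays of per-period forecast losses (e.g. squared errors).
    h: forecast horizon; the long-run variance sums h-1 autocovariances.
    Under the null of equal accuracy, the statistic is asymptotically N(0,1).
    """
    d = np.asarray(loss1, dtype=float) - np.asarray(loss2, dtype=float)
    T = d.size
    dbar = d.mean()
    dc = d - dbar
    # long-run variance estimate: gamma_0 + 2 * sum_{k=1}^{h-1} gamma_k
    lrv = dc @ dc / T
    for k in range(1, h):
        lrv += 2.0 * (dc[k:] @ dc[:-k]) / T
    return dbar / np.sqrt(lrv / T)
```

A positive statistic indicates larger average losses for the first forecast; swapping the two loss series flips the sign.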
We study an individual who faces a dynamic decision problem in which the process of information arrival is unobserved by the analyst. We derive two utility representations of preferences over menus of acts that capture the individual’s uncertainty about his future beliefs. The most general representation identifies a unique probability distribution over the set of posteriors that the decision maker might face at the time of choosing from the menu. We use this representation to characterize a notion of “more preference for flexibility” via a subjective analogue of Blackwell’s (1951, 1953) comparisons of experiments. A more specialized representation uniquely identifies information as a partition of the state space. This result allows us to compare individuals who expect to learn differently, even if they do not agree on their prior beliefs. We conclude by extending the basic model to accommodate an individual who expects to learn gradually over time by means of a subjective filtration. Download Paper
This paper studies optimal Ramsey taxation when risk sharing in private insurance markets is imperfect due to limited enforcement. In a limited commitment economy, there are externalities associated with capital and labor because individuals do not take into account that their labor and saving decisions affect aggregate supply, wages and thus the value of autarky. Due to these externalities, the Ramsey government has an additional goal, which is to internalize the externalities of labor and capital to improve risk sharing, in addition to its usual goal of minimizing distortions when financing government expenditures. These two goals drive capital and labor taxes in opposite directions. By balancing these conflicting goals, the steady-state optimal capital income taxes are levied only to remove the negative externality of capital, and optimal labor income taxes are set to meet the budgetary needs of the government in the long run, despite the positive externalities of labor. Download Paper
A large literature uses matching models to analyze markets with two-sided heterogeneity, studying problems such as the matching of students to schools, residents to hospitals, husbands to wives, and workers to firms. The analysis typically assumes that the agents have complete information, and examines core outcomes. We formulate a notion of stable outcomes in matching problems with one-sided asymmetric information. The key conceptual problem is to formulate a notion of a blocking pair that takes account of the inferences that the uninformed agent might make from the hypothesis that the current allocation is stable. We show that the set of stable outcomes is nonempty in incomplete information environments, and is a superset of the set of complete-information stable outcomes. We provide sufficient conditions for incomplete-information stable matchings to be efficient. Download Paper
Savage (1954) provided axioms on preferences over acts that were equivalent to the existence of an expected utility representation. We show that there is a continuum of other “expected utility” representations in which for any act, the probability distribution over states depends on the corresponding outcomes. We suggest that optimism and pessimism can be captured by the stake-dependent probabilities in these alternative representations. Extending the DM's preferences to be defined on both subjective acts and objective lotteries, we suggest how one may distinguish optimists from pessimists and separate attitude towards uncertainty from curvature of the utility function over monetary prizes. Download Paper
People often wonder why economists analyze models whose assumptions are known to be false, while economists feel that they learn a great deal from such exercises. We suggest that part of the knowledge generated by academic economists is case-based rather than rule-based. That is, instead of offering general rules or theories that should be contrasted with data, economists often analyze models that are “theoretical cases”, which help understand economic problems by drawing analogies between the model and the problem. According to this view, economic models, empirical data, experimental results and other sources of knowledge are all on equal footing, that is, they all provide cases to which a given problem can be compared. We offer complexity arguments that explain why case-based reasoning may sometimes be the method of choice; why economists prefer simple examples; and why a paradigm may be useful even if it does not produce theories. Download Paper
We propose a model of history-dependent risk attitude, allowing a decision maker’s risk attitude to be affected by his history of disappointments and elations. The decision maker recursively evaluates compound risks, classifying realizations as disappointing or elating using a threshold rule. We establish equivalence between the model and two cognitive biases: risk attitudes are reinforced by experiences (one is more risk averse after disappointment than after elation) and there is a primacy effect (early outcomes have the greatest impact on risk attitude). In dynamic asset pricing, the model yields volatile, path-dependent prices. Download Paper
We develop a model of a Parole Board contemplating whether to grant parole release to a prisoner who has finished serving their minimum sentence. The model implies a simple outcome test for racial prejudice robust to the inframarginality problem. Our test involves running simple regressions of whether a prisoner recidivates on the exposure time to the risk of recidivism and its square, using only the sample of prisoners who are granted parole release strictly between their minimum and maximum sentences and separately by race. If the coefficient estimates on the exposure time term differ by race, then there is evidence of racial prejudice against the racial group with the smaller coefficient estimate. We implement our test for prejudice using data from Pennsylvania from January 1996 to December 31, 2001. Although we find racial differences in time served, we find no evidence for racial prejudice on the part of the Parole Board based on our outcome test. Download Paper
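The by-race regression step of the outcome test described above can be sketched in a few lines. This is a minimal illustration assuming plain OLS on synthetic data; the function name `exposure_coefs` is ours, and the paper's actual specification may differ (e.g. additional controls, sample restrictions).

```python
import numpy as np

def exposure_coefs(recid, exposure):
    """OLS of a recidivism outcome on exposure time and its square.

    recid: outcome array (0/1 indicator in the application);
    exposure: time at risk of recidivism.
    Returns (intercept, b_time, b_time_sq). Run separately by race;
    per the outcome test, a smaller exposure-time coefficient for one
    group is evidence of prejudice against that group.
    """
    x = np.asarray(exposure, dtype=float)
    y = np.asarray(recid, dtype=float)
    X = np.column_stack([np.ones_like(x), x, x ** 2])  # intercept, t, t^2
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta
```

The test then amounts to comparing the estimated `b_time` coefficients across racial groups.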
Consider an agent who is unsure of the state of the world and faces computational bounds on mental processing. The agent receives a sequence of signals imperfectly correlated with the true state that he will use to take a single decision. The agent is assumed to have a finite number of "states of mind" that quantify his beliefs about the relative likelihood of the states, and uses the signals he receives to move from one state to another. At a random stopping time, the agent will be called upon to make a decision based solely on his mental state at that time. We show that under quite general conditions it is optimal that the agent ignore signals that are not very informative, that is, signals for which the likelihood of the states is nearly equal. This model provides a possible explanation of systematic inference mistakes people may make. Download Paper
In Medicare Part D, low-income individuals receive subsidies to enroll in insurance plans. This paper studies how premiums are distorted by the combined effects of this subsidy and the default assignment of low-income enrollees into plans. Removing this distortion could reduce the cost of the program without worsening consumers' welfare. Using data from the first five years of the program, an econometric model is used to estimate consumers' demand for plans and to compute what premiums would be without the subsidy distortion. Preliminary estimates suggest that the reduction in premiums of affected plans would be substantial. Download Paper
Cross-sectional productivity dispersion is countercyclical, at the plant level and at the firm level. I incorporate a firm’s project choice decision into a firm dynamics model with business cycle features to explain this empirical finding both qualitatively and quantitatively. In particular, all projects available have the same expected flow return and differ from one another only in the riskiness level. The endogenous option of exiting the market and limited funding for new investment jointly play an important role in motivating firms’ risk-taking behavior. The model predicts that relatively small firms are more likely to take risk and that the cross-sectional productivity dispersion, measured as the variance/standard deviation of firm-level profitability, is larger in recessions. Download Paper
We consider the impact of job rotation in a directed search model in which firm sizes are endogenously determined and match quality is initially unknown. A large firm benefits from the opportunity to rotate workers so as to partially overcome losses from mismatch. As a result, in the unique symmetric equilibrium, large firms have higher labor productivity and lower separation rates. In contrast to the standard directed search model with multi-vacancy firms, this model can generate a positive correlation between firm size and wage without introducing any ex ante productivity differences or imposing any non-concave production function assumption. Download Paper
We present a dynamic signaling model where wasteful education takes place over several periods of time. Workers pay an education cost per unit of time and cannot commit to a fixed education length. Workers face an exogenous dropout risk before graduation. Since low-productivity workers' cost is high, pooling with early dropouts helps them to avoid a high education cost. In equilibrium, low-productivity workers choose to endogenously drop out over time, so the productivity of workers in college increases along the education process. We find that (1) wasteful education signals exist even when job offers are privately made and the length of the period is small, (2) the maximum education length is decreasing in the prior about a worker being highly productive, and (3) the joint dynamics of returns to education and the dropout rate are characterized, which is consistent with previous empirical evidence. Download Paper
This paper studies the optimal redistribution of income inequality caused by the presence of search and matching frictions in the labor market. We study this problem in the context of a directed search model of the labor market populated by homogeneous workers and heterogeneous firms. The optimal redistribution can be attained using a positive unemployment benefit and an increasing and regressive labor income tax. The positive unemployment benefit serves the purpose of lowering the search risk faced by workers. The increasing and regressive labor tax serves the purpose of aligning the cost to the firm of attracting an additional applicant with the value of an application to society. Download Paper
Machina (2009, 2012) lists a number of situations where standard models of ambiguity aversion are unable to capture plausible features of ambiguity attitudes. Most of these problems arise in choice over prospects involving three or more outcomes. We show that the recursive non-expected utility model of Segal (1987) is rich enough to accommodate all these situations. Download Paper
We propose and illustrate a Markov-switching multi-fractal duration (MSMD) model for analysis of inter-trade durations in financial markets. We establish several of its key properties with emphasis on high persistence (indeed long memory). Empirical exploration suggests MSMD's superiority relative to leading competitors. Download Paper
We consider all-pay auctions in the presence of interdependent, affiliated valuations and private budget constraints. For the sealed-bid, all-pay auction we characterize a symmetric equilibrium in continuous strategies for the case of N bidders and we investigate its properties. Budget constraints encourage more aggressive bidding among participants with large endowments and intermediate valuations. We extend our results to the war of attrition where we show that budget constraints lead to a uniform amplification of equilibrium bids among bidders with sufficient endowments. An example shows that with both interdependent valuations and private budget constraints, a revenue ranking between the two mechanisms is generally not possible. Download Paper