Though Massive Open Online Courses are very different from each other in their structure, content, or audience, they are all characterized by low completion rates. In this paper, we use the Cox proportional hazard model to analyze student retention in the MOOC Principles of Microeconomics. Using two different measures of retention, video watching and quiz submission, we show that students’ commitment to the course can be strongly predicted by their participation in the first week’s activities. Data collected through a voluntary opt-in survey allow us to study retention in relation to demographics, for a subset of enrollees. We find a higher dropout rate for college students and younger participants. Female attrition is larger when measured by video watching but not when measured by quiz completion. Self-ascribed motivation for taking the course is not a predictor of completion. We conclude that raw completion rates cannot be the criterion for judging the success of MOOCs, as they are for traditional courses. The results are consistent with the existing literature, which separates MOOC students into two different groups: Committed Learners and Browsers.
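As a sketch of the estimation approach described above, the snippet below fits a Cox proportional hazards model of time-to-dropout on first-week participation and demographic covariates. It uses the Python `lifelines` package, and the data file and column names are hypothetical placeholders rather than the paper's actual variables.

```python
# Illustrative sketch only: a Cox proportional hazards model of MOOC retention.
# The file and column names are hypothetical, not the paper's data.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("mooc_retention.csv")  # one row per enrollee (hypothetical file)

cols = ["weeks_active", "dropped",              # duration and dropout indicator
        "first_week_videos", "first_week_quizzes",
        "age", "is_college_student", "female"]

cph = CoxPHFitter()
cph.fit(df[cols], duration_col="weeks_active", event_col="dropped")
cph.print_summary()  # hazard ratios > 1 flag covariates associated with faster attrition
```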
The problem of allocating bundles of indivisible objects without transfers arises in the assignment of courses to students, of computing resources like CPU time, memory and disk space to computing tasks, and of truckloads of food to food banks. In these settings the complementarities in preferences are small compared with the size of the market. We exploit this to design mechanisms satisfying efficiency, envy-freeness and asymptotic strategy-proofness. Informally, we assume that agents do not want bundles that are too large. There will be a parameter k such that the marginal utility of any item relative to a bundle of size k or larger is zero. We call such preferences k-demand preferences. Given this parameter we show how to represent probability shares over bundles as lotteries over approximately feasible (deterministic) integer allocations. The degree of infeasibility in these integer allocations will be controlled by the parameter k. In particular, ex post, no good is over-allocated by more than k − 1 units.
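A minimal formalization of the k-demand condition and the near-feasibility guarantee stated above, in notation introduced here for illustration (the paper's own notation may differ):

```latex
% k-demand preferences: items have no marginal value once a bundle reaches size k.
u_i(B \cup \{x\}) = u_i(B)
  \qquad \text{whenever } |B| \ge k \text{ and } x \notin B.

% Ex-post near-feasibility: with x_{ig} the units of good g assigned to agent i
% and s_g the supply of good g, over-allocation is bounded by k-1 units.
\sum_i x_{ig} \;\le\; s_g + (k - 1) \qquad \text{for every good } g.
```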
This paper introduces a model of endogenous network formation and systemic risk. First, agents form networks that efficiently trade off the possibility of systemic risk with the benefits of trade. Second, fundamentally ‘safer’ economies generate higher interconnectedness, which in turn leads to higher systemic risk. Third, the structure of the network formed depends on whether the shocks to the system are believed to be correlated or independent of each other. In particular, when shocks are perfectly correlated, the network formed is a complete graph, i.e., a link between every pair of agents. This underlines the importance of specifying the shock structure before investigating a given network, because a given network and shock structure could be incompatible.
The National Resident Matching Program strives for a stable matching of medical students to teaching hospitals. With the presence of couples, stable matchings need not exist. For any student preferences, we show that each instance of a stable matching problem has a ’nearby’ instance with a stable matching. The nearby instance is obtained by perturbing the capacities of the hospitals. Specifically, given a reported capacity k_h for each hospital h, we find a redistribution of the slot capacities k'_h satisfying |k_h − k'_h| ≤ 4 for all hospitals h, and ∑_h k_h ≤ ∑_h k'_h ≤ ∑_h k_h + 9, such that a stable matching exists with respect to k'. Our approach is general and applies to other types of complementarities, as well as matchings with side constraints and contracts.
As demand increases, airline carriers often increase flight frequencies to meet the larger flow of passengers in their networks, which reduces passengers' schedule delays and attracts more demand. Motivated by this, I study a structural model of the U.S. airline industry that accounts for possible network effects of demand. Compared with previous studies, the model implies higher cost estimates, which seem more consistent with the unprofitability of the industry; below-marginal-cost pricing becomes possible and appears on many routes. I also study airline mergers and find that the network effects can be the main factor underlying their profitability.
We propose a formal model of scientific modeling, geared to applications of decision theory and game theory. The model highlights the freedom that modelers have in conceptualizing social phenomena using general paradigms in these fields. It may shed some light on the distinctions between (i) refutation of a theory and a paradigm, (ii) notions of rationality, (iii) modes of application of decision models, and (iv) roles of economics as an academic discipline. Moreover, the model suggests that all four distinctions have some common features that are captured by the model.
To end a financial crisis, the central bank is to lend freely, against good collateral, at a high rate, according to Bagehot’s Rule. We argue that in theory and in practice there is a missing ingredient to Bagehot’s Rule: secrecy. Re-creating confidence requires that the central bank lend in secret, hiding the identities of the borrowers, to prevent information about individual collateral from being produced and to create an information externality by raising the perceived value of average collateral. Ironically, the participation of "bad" borrowers, with low quality collateral, in the central bank’s lending program is a desirable part of re-creating confidence because it creates stigma. Stigma is critical to sustain secrecy because no borrower wants to reveal his participation in the lending program, and it is limited by the central bank charging a high rate for its loans.
We show that political booms, measured by the rise in governments’ popularity, predict financial crises above and beyond other better-known early warning indicators, such as credit booms. This predictive power, however, only holds in emerging economies. We show that governments in emerging economies are more concerned about their reputation and tend to ride the short-term popularity benefits of weak credit booms rather than implementing politically costly corrective policies that would help prevent potential crises. We provide evidence of the relevance of this reputation mechanism.
This paper constructs a dynamic model of health insurance to evaluate the short- and long-run effects of policies that prevent firms from conditioning wages on the health conditions of their workers, and that prevent health insurance companies from charging individuals with adverse health conditions higher insurance premia. Our study is motivated by recent US legislation that has tightened regulations on wage discrimination against workers with poorer health status (such as the Americans with Disabilities Act of 1990, ADA, and its amendment in 2008, the ADAAA) and that prohibits health insurance companies from charging different premiums for workers of different health status starting in 2014 (the Patient Protection and Affordable Care Act, PPACA). In the model, a trade-off arises between the static gains from better insurance against poor health induced by these policies and their adverse dynamic incentive effects on household efforts to lead a healthy life. Using household panel data from the PSID, we estimate and calibrate the model and then use it to evaluate the static and dynamic consequences of no-wage-discrimination and no-prior-conditions laws for the evolution of the cross-sectional health and consumption distribution of a cohort of households, as well as the ex-ante lifetime utility of a typical member of this cohort. In our quantitative analysis we find that although the competitive equilibrium features too little consumption insurance and a combination of both policies is effective in providing such insurance period by period, it is suboptimal to introduce both policies jointly, since such a policy innovation severely undermines the incentives to lead healthier lives and thus induces a more rapid deterioration of the cohort health distribution over time. This effect more than offsets the static gains from better consumption insurance, so that expected discounted lifetime utility is lower under both policies, relative to implementing wage nondiscrimination legislation alone. This is true despite the fact that both policy options are strongly welfare improving relative to the competitive equilibrium.
Banks are optimally opaque institutions. They produce debt for use as a transaction medium (bank money), which requires that information about the backing assets – loans – not be revealed, so that bank money does not fluctuate in value, reducing the efficiency of trade. This need for opacity conflicts with the production of information about investment projects, needed for allocative efficiency. Intermediaries exist to hide such information, so banks select portfolios of information-insensitive assets. For the economy as a whole, firms endogenously separate into bank finance and capital market/stock market finance depending on the cost of producing information about their projects.
This paper investigates how cooperation can be sustained in large societies even in the presence of agents who never cooperate. In order to do this, we consider a large but finite society where in each period agents are randomly matched in pairs. Nature decides, within each match, which agent needs help in order to avoid some loss, and which agent can help him by incurring a cost smaller than the loss. Each agent observes only his own history, and we assume that agents do not recognize each other. We introduce and characterize a class of equilibria, named linear equilibria, in which cooperation takes place. Within this class, efficiency can be achieved with simple one-period strategies, which are close to a tit-for-tat strategy when the society is large, and which generate smooth dynamics of the expected average level of cooperation. Unlike previously suggested equilibria in similar environments, our equilibria are robust to the presence of behavioral agents and other perturbations of the base model. We also apply our model to bilateral trade with many traders, where we find that the mechanism of transmitting defections is transmissive and not contagious as in our base model.
This paper investigates empirically the fiscal and welfare trade-offs involved in designing a pension system when workers can avoid participation by working informally. A dynamic behavioral model captures a household's labor supply, formal/informal sector choice and saving decisions under the rules of Chile's canonical privatized pension system. The parameters governing household preferences and earnings opportunities in the formal and the informal sector are jointly estimated using a longitudinal survey linked with administrative data from the pension system's regulatory agency. The parameter estimates imply that formal job rationing is limited and that mandatory pension contributions play a sizeable role in encouraging informality. Our policy experiments show that Chile could achieve a 23% reduction in minimum pension costs, while guaranteeing the same level of income in retirement, by increasing the rate at which benefits taper off.
Asymmetric information is an important source of inefficiency when an asset (such as a firm) is transacted. The two main sources of this asymmetry are the unobserved idiosyncratic characteristics of the asset (such as future profitability) and unobserved idiosyncratic choices (like secret price cuts). Buyers may use noisy signals (such as sales) in order to infer actions and characteristics. In this situation, does the seller prefer to release information fast or slowly? Is it incentive compatible? When the market is pessimistic, is it better to give up or keep signaling? We introduce hidden actions in a dynamic signaling model in order to answer these questions. Separation is found to be fast in equilibrium when sending highly informative signals is more efficient than sending less informative signals. When the market is pessimistic about the quality of the asset, depending on the cost structure, the seller either “gives up” by stopping signaling, or “rushes out” by increasing the informativeness of the signal. We find that the unobservability of the action causes equilibrium effort to be too low and the seller to stop signaling too early. The model can be applied to education, where grades depend on students’ effort, which is endogenously related to their skills.
The paper studies inference in nonlinear models where loss of identification occurs in multiple parts of the parameter space. For uniform inference, we develop a local limit theory that models mixed identification strength. Building on this non-standard asymptotic approximation, we suggest robust tests and confidence intervals in the presence of non-identified and weakly identified nuisance parameters. In particular, this covers applications where some nuisance parameters are non-identified under the null (Davies (1977, 1987)) and some nuisance parameters are subject to a full range of identification strength. The asymptotic results involve both inconsistent estimators that depend on a localization parameter and consistent estimators with different rates of convergence. A sequential argument is used to peel the criterion function based on the identification strength of the parameters. The robust test is uniformly valid and non-conservative.
I provide empirical evidence of changes in the U.S. Treasury yield curve and related macroeconomic factors, and investigate whether the changes are brought about by external shocks, monetary policy, or both. To explore this, I characterize bond market exposures to macroeconomic and monetary policy risks, using an equilibrium term structure model with recursive preferences in which inflation dynamics are endogenously determined. In my model, the key risks that affect bond market prices are changes in the correlation between growth and inflation and changes in the conduct of monetary policy. Using a novel estimation technique, I find that the changes in monetary policy affect the volatility of yield spreads, while the changes in the correlation between growth and inflation affect both the level and the volatility of yield spreads. Consequently, the changes in the correlation structure are the main contributor to bond risk premia and to bond market volatility. The time variations within a regime and risks associated with moving across regimes lead to the failure of the Expectations Hypothesis and to the excess bond return predictability regression of Cochrane and Piazzesi (2005), as in the data.
We consider the dynamic pricing problem of a durable good monopolist with full commitment power, when a new version of the good is expected at some point in the future. The new version of the good is superior to the existing one, bringing a higher flow utility. If the arrival is a stationary stochastic process, then the corresponding optimal price path is shown to be constant for both versions of the good; hence there is no delay in purchases and time is not used to discriminate among buyers, which is in line with the literature. However, if the arrival of the new version occurs at a commonly known deterministic date, then the optimal price path may be decreasing over time, resulting in delayed purchases. For both stochastic and deterministic arrival processes, posted prices are not the optimal mechanism; the optimal mechanism instead involves bundling the new and old versions of the good and selling them only together.
The recent public debt crisis in most developed economies implies an urgent need for increasing tax revenues or cutting government spending. In this paper we study the importance of household heterogeneity and the progressivity of the labor income tax schedule for the ability of the government to generate tax revenues. We develop an overlapping generations model with uninsurable idiosyncratic risk, endogenous human capital accumulation as well as labor supply decisions along the intensive and extensive margins. We calibrate the model to macro, micro and tax data from the US as well as a number of European countries, and then for each country characterize the labor income tax Laffer curve under the current country-specific choice of the progressivity of the labor income tax code. We find that more progressive labor income taxes significantly reduce tax revenues. For the US, converting to a flat tax code raises the peak of the Laffer curve by 7%. We also find that modeling household heterogeneity is important for the shape of the Laffer curve.
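For intuition about the Laffer-curve object discussed above, here is a deliberately stylized sketch: a single worker with isoelastic labor supply facing a flat tax. This is not the paper's overlapping generations model, and all parameter values are invented for illustration.

```python
# Stylized, illustrative Laffer curve (not the paper's model); parameters are assumed.
import numpy as np

frisch = 1.0   # labor supply elasticity (assumed)
wage = 1.0     # wage normalization (assumed)

def revenue(tau):
    # With quasi-linear utility c - h**(1 + 1/frisch) / (1 + 1/frisch) and
    # consumption c = (1 - tau) * wage * h, optimal hours are ((1 - tau) * wage)**frisch.
    hours = ((1 - tau) * wage) ** frisch
    return tau * wage * hours

taus = np.linspace(0.0, 0.99, 100)
rev = [revenue(t) for t in taus]
print("Revenue-maximizing flat tax:", taus[int(np.argmax(rev))])  # about 0.5 here
```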
This paper analyzes a dynamic education signaling model with dropout risk. Workers pay an education cost per unit of time and face an exogenous dropout risk before graduation. Since low-productivity workers’ cost of education is high, pooling with early dropouts helps them avoid a high education cost. In equilibrium, low-productivity workers choose to endogenously drop out over time, so the productivity of workers in college increases as the education process progresses. We find that the exogenous dropout risk generates rich dynamics in the endogenous dropout behavior of workers, and the maximum education length is decreasing in the prior about a worker being highly productive. We also extend the baseline model by allowing human capital accumulation.
The last three recessions in the United States were followed by jobless recoveries: while labor productivity recovered, unemployment remained high. In this paper, we show that countercyclical unemployment benefit extensions lead to jobless recoveries. We augment the standard Mortensen-Pissarides model to incorporate unemployment benefits expiration and state-dependent extensions of unemployment benefits. In the model, an extension of unemployment benefits slows down the recovery of vacancy creation in the aftermath of a recession. We calibrate the model to US data and show that it is quantitatively consistent with observed labor market dynamics, in particular the emergence of jobless recoveries after 1990. Furthermore, counterfactual experiments indicate that unemployment benefits are quantitatively important in explaining jobless recoveries.
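The vacancy-creation channel described above runs through the standard matching function. The toy calculation below uses textbook Mortensen-Pissarides ingredients with made-up parameter values, not the paper's calibration, just to show how fewer vacancies translate into a lower job-finding rate.

```python
# Toy illustration of the matching-function channel; parameters are assumed,
# not the paper's calibration.
mu, alpha = 0.6, 0.5            # matching efficiency and elasticity (assumed)

def job_finding_rate(u, v):
    # Cobb-Douglas matching m = mu * u**alpha * v**(1 - alpha), so f = m / u.
    return mu * (v / u) ** (1 - alpha)

# A benefit extension that depresses vacancy creation (lower v) lowers the
# job-finding rate and slows the employment recovery.
print(job_finding_rate(u=0.08, v=0.04))   # more vacancies -> faster job finding
print(job_finding_rate(u=0.08, v=0.02))   # fewer vacancies -> slower job finding
```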
We propose a new category of consumption goods, memorable goods, that generate a flow of utility after consumption. We analyze an otherwise standard consumption model that distinguishes memorable goods from other nondurable goods. Consumers optimally choose lumpy consumption of memorable goods. We then empirically document significant differences between levels and volatilities of memorable and other nondurable good expenditures. In two applications we find that the welfare cost of consumption fluctuations driven by income shocks are significantly overstated if memorable goods are not accounted for and that estimates of excess sensitivity of consumption might be entirely due to memorable goods.
Research from the United States shows that gaps in early cognitive and non-cognitive ability appear early in the life cycle. Little is known about this important question for developing countries. This paper provides new evidence of sharp differences in cognitive development by socioeconomic status in early childhood for five Latin American countries. To help with comparability, we use the same measure of receptive language ability for all five countries. We find important differences in development in early childhood across countries, and steep socioeconomic gradients within every country. For the three countries where we can follow children over time, there are few substantive changes in scores once children enter school. Our results are robust to different ways of defining socioeconomic status, to different ways of standardizing outcomes, and to selective non-response on our measure of cognitive development.
The paper studies equilibrium pricing in a product market for an indivisible good where buyers search for sellers. Buyers search sequentially for sellers, but do not meet every seller with the same probability. Specifically, a fraction of the buyers’ meetings lead to one particular large seller, while the remaining meetings lead to one of a continuum of small sellers. In this environment, the small sellers would like to set a price that makes the buyers indifferent between purchasing the good and searching for another seller. The large seller would like to price the small sellers out of the market by posting a price that is low enough to induce buyers not to purchase from the small sellers. These incentives give rise to a game of cat and mouse, whose only equilibrium involves mixed strategies for both the large and the small sellers. The fact that the small sellers play mixed strategies implies that there is price dispersion. The fact that the large seller plays mixed strategies implies that prices and allocations vary over time. We show that the fraction of the gains from trade accruing to the buyers is positive and non-monotonic in the degree of market power of the large seller. As long as the large seller has some positive but incomplete market power, the fraction of the gains from trade accruing to the buyers depends in a natural way on the extent of search frictions.
Credit booms usually precede financial crises. However, some credit booms end in a crisis (bad booms) and other booms do not (good booms). We document that, while all booms start with an increase in the growth of Total Factor Productivity (TFP), such growth falls much faster subsequently for bad booms. We then develop a simple framework to explain this. Firms finance investment opportunities with short-term collateralized debt. If agents do not produce information about the collateral quality, a credit boom develops, accommodating firms with lower quality projects and increasing the incentives of lenders to acquire information about the collateral, eventually triggering a crisis. When the quality of investment opportunities also grows, the credit boom may not end in a crisis because there is a gradual adoption of low quality projects, but those projects are also of better quality, not inducing information about collateral.
This paper demonstrates that a misspecified model of information processing interferes with long-run learning and offers an explanation for why individuals may continue to choose an inefficient action, despite sufficient public information to learn the true state. I consider a social learning environment where agents draw inference from private signals, public signals and the actions of their predecessors, and sufficient public information exists to achieve asymptotically efficient learning. Prior actions aggregate multiple sources of information; agents face an inferential challenge to distinguish new information from redundant information. I show that when individuals significantly overestimate the amount of new information contained in prior actions, beliefs about the unknown state become entrenched and incorrect learning may occur. On the other hand, when individuals sufficiently overestimate the amount of redundant information, beliefs are fragile and learning is incomplete. When agents have an approximately correct model of inference, learning is complete - the model with no information-processing bias is robust to perturbation.
This paper formalizes the design of experiments intended specifically to study spillover effects. By first randomizing the intensity of treatment within clusters and then randomly assigning individual treatment conditional on this cluster-level intensity, a novel set of treatment effects can be identified. We develop a formal framework for consistent estimation of these effects, and provide explicit expressions for power calculations. We show that the power to detect average treatment effects declines precisely with the quantity that identifies the novel treatment effects. A demonstration of the technique is provided using a cash transfer program in Malawi.
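A minimal sketch of the two-stage randomization described above: cluster-level treatment intensities are drawn first, then individual assignment is randomized within each cluster at that intensity. The cluster counts, intensity values, and seed are hypothetical, chosen only for illustration.

```python
# Illustrative two-stage (randomized saturation) assignment; all numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_clusters, cluster_size = 20, 50
saturations = np.array([0.0, 0.25, 0.5, 0.75])   # candidate cluster-level intensities

# Stage 1: randomize the treatment intensity across clusters.
cluster_sat = rng.choice(saturations, size=n_clusters)

# Stage 2: randomize individual treatment within each cluster at its intensity.
treatment = np.concatenate([rng.random(cluster_size) < s for s in cluster_sat])
cluster_id = np.repeat(np.arange(n_clusters), cluster_size)

# Untreated individuals in partially treated clusters, compared with individuals
# in zero-intensity clusters, are what identify spillover effects.
```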
Many violations of the Independence axiom of Expected Utility can be traced to subjects' attraction to risk-free prospects. The key axiom in this paper, Negative Certainty Independence (Dillenberger, 2010), formalizes this tendency. Our main result is a utility representation of all preferences over monetary lotteries that satisfy Negative Certainty Independence together with basic rationality postulates. Such preferences can be represented as if the agent were unsure of how to evaluate a given lottery p; instead, she has in mind a set of possible utility functions over outcomes and displays a cautious behavior: she computes the certainty equivalent of p with respect to each possible function in the set and picks the smallest one. The set of utilities is unique in a well-defined sense. We show that our representation can also be derived from a `cautious' completion of an incomplete preference relation.
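In symbols, the cautious evaluation described above can be written as follows, where W denotes the set of possible utility functions over outcomes (notation introduced here for exposition):

```latex
% Cautious evaluation: compute the certainty equivalent of lottery p under each
% utility v in the set W, and pick the smallest.
V(p) \;=\; \min_{v \in \mathcal{W}} \, v^{-1}\!\big(\mathbb{E}_p[v]\big)
```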
This paper argues that openness to new, unconventional and disruptive ideas has a first-order impact on creative innovations - innovations that break new ground in terms of knowledge creation. After presenting a motivating model focusing on the choice between incremental and radical innovation, and on how managers of different ages and human capital are sorted across different types of firms, we provide cross-country, firm-level and patent-level evidence consistent with this pattern. Our measures of creative innovations proxy for innovation quality (average number of citations per patent) and creativity (fraction of superstar innovators, the likelihood of a very high number of citations, and generality of patents). Our main proxy for openness to disruption is manager age. This variable is based on the idea that only companies or societies open to such disruption will allow the young to rise up within the hierarchy. Using this proxy at the country, firm or patent level, we present robust evidence that openness to disruption is associated with more creative innovations.
It is commonly believed that, since unanimity rule safeguards the rights of each individual, it protects minorities from the possibility of expropriation, thus yielding more equitable outcomes than majority rule. We show that this is not necessarily the case in bargaining environments. We study a multilateral bargaining model à la Baron and Ferejohn (1989), where players are heterogeneous with respect to the potential surplus they bring to the bargaining table. We show that unanimity rule may generate equilibrium outcomes that are more unequal (or less equitable) than under majority rule. In fact, as players become perfectly patient, we show that the more inclusive the voting rule, the less equitable the equilibrium allocations.
This paper is a study of the shape and structure of the distribution of prices at which an identical good is sold in a given market and time period. We find that the typical price distribution is symmetric and leptokurtic, with a standard deviation between 19% and 36%. Only 10% of the variance of prices is due to variation in the expensiveness of the stores at which a good is sold, while the remaining 90% is due, in approximately equal parts, to differences in the average price of a good across equally expensive stores and to differences in the price of a good across transactions at the same store. We show that the distribution of prices that households pay for the same bundle of goods is approximately Normal, with a standard deviation between 9% and 14%. Half of this dispersion is due to differences in the expensiveness of the stores where households shop, while the other half is mostly due to differences in households’ choices of which goods to purchase at which stores. We find that households with fewer employed members pay lower prices, and do so by visiting a larger number of stores, rather than by shopping more frequently.
We consider all-pay auctions in the presence of interdependent, affiliated valuations and private budget constraints. For the sealed-bid, all-pay auction we characterize a symmetric equilibrium in continuous strategies for the case of N bidders. Budget constraints encourage more aggressive bidding among participants with large endowments and intermediate valuations. We extend our results to the war of attrition where we show that budget constraints lead to a uniform amplification of equilibrium bids among bidders with sufficient endowments. An example shows that with both interdependent valuations and private budget constraints, a revenue ranking between the two auction formats is generally not possible. Equilibria with discontinuous bidding strategies are discussed.
What has caused the rising gap in health insurance coverage by education in the U.S. over the last thirty years? How does the employment-based health insurance market interact with the labor market? What are the effects of social insurance such as Medicaid? By developing and structurally estimating an equilibrium model, I find that the interaction between labor market technological changes and the cost growth of medical services explains 60% to 70% of the gap. Using counterfactual experiments, I also evaluate the impact of further Medicaid eligibility expansion and employer mandates introduced in the Affordable Care Act on labor and health insurance markets.
Using a connectedness-measurement technology fundamentally grounded in modern network theory, we measure real output connectedness for a set of six developed countries, 1962-2010. We show that global connectedness is sizable and time-varying over the business cycle, and we study the nature of the time variation relative to the ongoing discussion about the changing nature of the global business cycle. We also show that connectedness corresponding to transmissions to others from the United States and Japan is disproportionately important.
An endogenous growth model is developed where each period firms invest in researching and developing new ideas. An idea increases a firm's productivity. By how much depends on how central the idea is to a firm's activity. Ideas can be bought and sold on a market for patents. A firm can sell an idea that is not relevant to its business or buy one if it fails to innovate. The developed model is matched up with stylized facts about the market for patents in the U.S. The analysis attempts to gauge how efficiency in the patent market affects growth.
Maximizing subjective expected utility is the classic model of decision-making under uncertainty. Savage (1954) provides axioms on preference over acts that are equivalent to the existence of a subjective expected utility representation, and further establishes that such a representation is essentially unique. We show that there is a continuum of other "expected utility" representations in which the probability distributions over states used to evaluate acts depend on the set of possible outcomes of the act, and suggest that these alternate representations can capture pessimism or optimism. We then extend the DM's preferences to be defined over both subjective acts and objective lotteries, allowing for source-dependent preferences. Our result permits modeling ambiguity aversion in Ellsberg's two-urn experiment using a single utility function and pessimistic probability assessments over prizes for lotteries and acts, while maintaining the axioms of Savage and von Neumann-Morgenstern on the appropriate domains.
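One way to write the outcome-dependent representations described above, in notation introduced here for illustration (S is the state space, O(f) the set of possible outcomes of act f, and π_O a probability over states that may vary with O):

```latex
% Outcome-dependent "expected utility": the belief used to evaluate an act
% depends on the act's set of possible outcomes.
W(f) \;=\; \sum_{s \in S} \pi_{O(f)}(s)\, u\big(f(s)\big),
\qquad O(f) \;=\; \{\, f(s) : s \in S \,\}.
```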
We study an individual who faces a dynamic decision problem in which the process of information arrival is unobserved by the analyst. We elicit subjective information directly from choice behavior by deriving two utility representations of preferences over menus of acts. One representation uniquely identifies information as a probability measure over posteriors and the other identifies information as a partition of the state space. We compare individuals who expect to learn differently in terms of their preference for flexibility. On the extended domain of dated-menus, we show how to accommodate gradual learning over time by means of a subjective filtration.