We combine the small open economy real business cycle framework with the endogenous growth literature to study the productivity cost of a sudden stop. In this economy, productivity growth is determined by the successful implementation of business ideas, yet the quality of ideas is heterogeneous and good ideas are scarce. A representative financial intermediary screens and selects the most promising ideas, which gives rise to a trade-off between mass (quantity) and composition (quality) in the entrant cohort. Chilean plant-level data from the sudden stop triggered by the 1998 Russian sovereign default confirm the main mechanism of the model: firms born during the credit shortage are fewer, but better. A calibrated version of the economy shows the importance of accounting for heterogeneity and selection; ignoring them doubles the permanent loss of output generated by the forgone entrants, which increases the welfare cost by 30%. Download Paper
There are many data sets based on the population of discovered cartels, and it is from these data that average cartel duration and the annual probability of cartel death are estimated. It is recognized, however, that these estimates could be biased because the population of discovered cartels may not be a representative sample of the population of cartels. This paper constructs a simple birth-death-discovery process to investigate theoretically what we can learn about cartels from data on discovered cartels. Download Paper
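To make the selection problem concrete, the following toy simulation (my own illustration with hypothetical hazard rates, not the paper's model or calibration) treats dissolution and discovery as competing exponential risks and compares the true mean cartel duration with the mean duration observed in the discovered sample:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
death_rate, discovery_rate = 0.15, 0.05  # hypothetical annual hazards

# Competing risks: each cartel either dissolves naturally or is discovered,
# whichever happens first.
natural_life = rng.exponential(1 / death_rate, size=n)
time_to_discovery = rng.exponential(1 / discovery_rate, size=n)
discovered = time_to_discovery < natural_life
observed_duration = np.minimum(natural_life, time_to_discovery)

print(f"true mean duration, all cartels:   {natural_life.mean():.2f} years")
print(f"mean duration, discovered sample:  {observed_duration[discovered].mean():.2f} years")
print(f"share of cartels ever discovered:  {discovered.mean():.2%}")
```

Even in this minimal setup the discovered sample understates true duration, which is exactly the kind of bias a birth-death-discovery process is designed to analyze.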
As demand increases, airline carriers often increase flight frequencies to meet the larger flow of passengers in their networks, which reduces passengers' schedule delays and attracts more demand. Focusing on the “network effects,” this paper develops and estimates a structural model of the U.S. airline industry. Compared with previous studies, the model implies higher cost estimates, which seem more consistent with the unprofitability of the industry; below-marginal-cost pricing becomes possible and appears on many routes. I also study airline mergers and find that the network effects can be the main factor underlying their profitability. Download Paper
This paper develops and implements a new approach for separately identifying preference and opportunity parameters of a two-sided search and matching model in the absence of data on choice sets. This approach exploits information on the dynamics of matches: how long it takes for singles to form matches, what types of matches they form, and how long the matches last. Willingness to accept a certain type of partner can be revealed through the dissolution of matches. Given recovered acceptance rules, the rates at which singles meet different types are inferred from the observed transitions from singlehood to matches. Imposing equilibrium conditions links acceptance rules and arrival rates to underlying preference and opportunity parameters. Using the Panel Study of Income Dynamics, I apply this method to examine the marriage patterns of non-Hispanic whites, non-Hispanic blacks and Hispanics in the United States. Results indicate that the observed infrequency of intermarriage is primarily attributable to a low incidence of interracial/interethnic meetings rather than same-race/ethnicity preferences. Simulations based on the estimated model show the effects of demographic changes on marital patterns. Download Paper
How much additional tax revenue can the government generate by increasing labor income taxes? In this paper we provide a quantitative answer to this question and study the importance of the progressivity of the tax schedule for the ability of the government to generate tax revenues. We develop a rich overlapping generations model featuring an explicit family structure, extensive and intensive margins of labor supply, endogenous accumulation of labor market experience, and standard intertemporal consumption-savings choices in the presence of uninsurable idiosyncratic labor productivity risk. We calibrate the model to US macro, micro and tax data and characterize the labor income tax Laffer curve under the current progressivity of the labor income tax code as well as when varying progressivity. We find that more progressive labor income taxes significantly reduce tax revenues. For the US, converting to a flat tax code raises the peak of the Laffer curve by 6%, whereas converting to a tax system with progressivity similar to Denmark's would lower the peak by 7%. We also show that, relative to a representative agent economy, tax revenues are less sensitive to the progressivity of the tax code in our economy. This finding is due to the fact that labor supply of two-earner households is less elastic (along the intensive margin) and that the endogenous accumulation of labor market experience makes female labor supply less elastic (along the extensive margin) to changes in tax progressivity. Download Paper
We propose point forecast accuracy measures based directly on the distance of the forecast-error c.d.f. from the unit step function at 0 (“stochastic error distance,” or SED). We provide a precise characterization of the relationship between SED and standard predictive loss functions, showing that all such loss functions can be written as weighted SEDs. The leading case is absolute-error loss, in which the SED weights are unity, establishing its primacy. Among other things, this suggests shifting attention away from conditional-mean forecasts and toward conditional-median forecasts. Download Paper
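As a sketch of the unit-weight case (my notation, not necessarily the paper's): for a forecast error e with c.d.f. F and finite first moment, the L1 distance between F and the unit step at 0 reduces to expected absolute error, consistent with the abstract's claim that absolute-error loss corresponds to unit SED weights:

```latex
\mathrm{SED}(F)
  = \int_{-\infty}^{\infty} \bigl|\, F(x) - \mathbf{1}\{x \ge 0\} \,\bigr|\, dx
  = \int_{-\infty}^{0} F(x)\, dx + \int_{0}^{\infty} \bigl(1 - F(x)\bigr)\, dx
  = \mathbb{E}\lvert e \rvert .
```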
In finite samples, the use of a slightly endogenous but highly relevant instrument can reduce mean-squared error (MSE). Building on this observation, I propose a moment selection criterion for GMM in which moment conditions are chosen based on the MSE of their associated estimators rather than their validity: the focused moment selection criterion (FMSC). I then show how the framework used to derive the FMSC can address the problem of inference post-moment selection. Treating post-selection estimators as a special case of moment-averaging, in which estimators based on different moment sets are given data-dependent weights, I propose a simulation-based procedure to construct valid confidence intervals for a variety of formal and informal moment-selection procedures. Both the FMSC and confidence interval procedure perform well in simulations. I conclude with an empirical example examining the effect of instrument selection on the estimated relationship between malaria transmission and per-capita income. Download Paper
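The abstract's opening claim is easy to verify in a toy Monte Carlo. The sketch below is entirely my own illustration (the design, coefficients, and sample size are hypothetical and unrelated to the paper's simulations): it compares just-identified IV estimators based on a valid but only modestly relevant instrument and on a slightly endogenous but highly relevant one. The latter can have lower MSE because its small bias is outweighed by the variance reduction:

```python
import numpy as np

rng = np.random.default_rng(0)
beta, n, reps = 1.0, 100, 5000
se_valid, se_endog = [], []

for _ in range(reps):
    u = rng.normal(size=n)            # structural error
    z = rng.normal(size=n)            # valid but only modestly relevant
    w0 = rng.normal(size=n)
    w = w0 + 0.2 * u                  # highly relevant but slightly endogenous
    x = 0.3 * z + 1.0 * w0 + 0.5 * u  # first stage
    y = beta * x + u
    se_valid.append(((z @ y) / (z @ x) - beta) ** 2)  # just-identified IV with z
    se_endog.append(((w @ y) / (w @ x) - beta) ** 2)  # just-identified IV with w

print("MSE, valid weak instrument:       ", np.mean(se_valid))
print("MSE, endogenous strong instrument:", np.mean(se_endog))
```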
In this paper we argue that very high marginal labor income tax rates are an effective tool for social insurance even when households have preferences with high labor supply elasticity, make dynamic savings decisions, and policies have general equilibrium effects. To make this point we construct a large-scale overlapping generations model with uninsurable labor productivity risk, show that it has a wealth distribution that matches the data well, and then use it to characterize fiscal policies that achieve a desired degree of redistribution in society. We find that marginal tax rates of close to 90% on the top 1% of the earnings distribution are optimal. We document that this result is robust to plausible variation in the labor supply elasticity and holds regardless of whether social welfare is measured at the steady state only or includes transitional generations. Download Paper
We propose and solve a small-scale New Keynesian model with Markov sunspot shocks that move the economy between a targeted-inflation regime and a deflation regime, and fit it to data from the U.S. and Japan. For the U.S. we find that adverse demand shocks moved the economy to the zero lower bound (ZLB) in 2009 and that an expansive monetary policy has kept it there subsequently. In contrast, Japan switched to the deflation regime in 1999 and has remained there since, except for a short period. The two scenarios have drastically different implications for macroeconomic policies. Fiscal multipliers are about 20% smaller in the deflation regime, despite the economy remaining at the ZLB. While a commitment by the central bank to keep rates near the ZLB doubles the fiscal multipliers in the targeted-inflation regime (U.S.), it has no effect in the deflation regime (Japan). Download Paper
We provide a novel methodology for estimating time-varying weights in linear prediction pools, which we call Dynamic Pools, and use it to investigate the relative forecasting performance of DSGE models with and without financial frictions for output growth and inflation from 1992 to 2011. We find strong evidence of time variation in the pool's weights, reflecting the fact that the DSGE model with financial frictions produces superior forecasts in periods of financial distress but does not perform as well in tranquil periods. The dynamic pool's weights react in a timely fashion to changes in the environment, leading to real-time forecast improvements relative to other methods of density forecast combination, such as Bayesian Model Averaging, optimal (static) pools, and equal weights. We show how a policymaker dealing with model uncertainty could have used a dynamic pool to perform a counterfactual exercise (responding to the gap in labor market conditions) in the immediate aftermath of the Lehman crisis. Download Paper
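For reference, one of the static benchmarks mentioned above, the optimal (static) pool, picks a single weight that maximizes the historical log predictive score. A minimal sketch follows (simulated stand-in densities, since the paper's DSGE forecasts are not reproduced here):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
y = rng.normal(size=80)  # realized data (simulated stand-in)

# Predictive densities of two hypothetical models evaluated at the realizations
p1 = np.exp(-0.5 * y**2) / np.sqrt(2 * np.pi)                  # N(0, 1)
p2 = np.exp(-0.5 * (y / 1.5)**2) / (1.5 * np.sqrt(2 * np.pi))  # N(0, 1.5^2)

def neg_log_score(lam):
    # Negative log predictive score of the linear pool with fixed weight lam
    return -np.sum(np.log(lam * p1 + (1 - lam) * p2))

res = minimize_scalar(neg_log_score, bounds=(0.0, 1.0), method="bounded")
print("static optimal weight on model 1:", round(res.x, 3))
```

A dynamic pool replaces the fixed weight with one that evolves over time, which is what lets the weight shift toward the financial-frictions model during periods of distress.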
This paper studies a class of continuous-time stochastic games in which the actions of a long-run player have a persistent effect on payoffs. For example, the quality of a firm's product depends on past as well as current effort, or the level of a policy instrument depends on a government's past choices. The long-run player faces a population of small players, and its actions are imperfectly observed. I establish the existence of Markov equilibria, characterize the Perfect Public Equilibria (PPE) payoff set as the convex hull of the Markov equilibria payoff set, and identify conditions for the uniqueness of a Markov equilibrium in the class of all PPE. The existence proof is constructive: it characterizes the explicit form of Markov equilibria payoffs and actions, for any discount rate. Action persistence creates a crucial new channel to generate intertemporal incentives in a setting where traditional channels fail, and can provide permanent non-trivial incentives in many settings. These results offer a novel framework for thinking about reputational dynamics of firms, governments, and other long-run agents. Download Paper
This paper formalizes the design of experiments intended specifically to study spillover effects. By first randomizing the intensity of treatment within clusters and then randomly assigning individual treatment conditional on this cluster-level intensity, a novel set of treatment effects can be identified. We develop a formal framework for consistent estimation of these effects, and provide explicit expressions for power calculations. We show that the power to detect average treatment effects declines precisely with the quantity that identifies the novel treatment effects. A demonstration of the technique is provided using a cash transfer program in Malawi. Download Paper
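The two-stage randomization the abstract describes is simple to implement. A minimal sketch (the cluster counts, sizes, and saturation menu are hypothetical, not the Malawi design):

```python
import numpy as np

rng = np.random.default_rng(2)
n_clusters, cluster_size = 20, 50
saturations = np.array([0.0, 0.25, 0.5, 0.75])  # hypothetical intensity menu

# Stage 1: randomize treatment intensity (saturation) across clusters.
pi = rng.choice(saturations, size=n_clusters)

# Stage 2: within each cluster, assign individual treatment with
# probability equal to the cluster's realized saturation.
treat = [rng.random(cluster_size) < p for p in pi]

# Untreated individuals in partially treated clusters, compared with
# individuals in saturation-0 clusters, identify spillover effects.
for c in range(3):
    print(f"cluster {c}: saturation={pi[c]:.2f}, treated={int(treat[c].sum())}")
```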
Though Massive Open Online Courses (MOOCs) are very different from each other in their structure, content, and audience, they are all characterized by low completion rates. In this paper, we use the Cox proportional hazard model to analyze student retention in the MOOC Principles of Microeconomics. Using two different measures of retention, video watching and quiz submission, we show that students’ commitment to the course can be strongly predicted by their participation in the first week’s activities. Data collected through a voluntary opt-in survey allow us to study retention in relation to demographics for a subset of enrollees. We find a higher dropout rate for college students and younger participants. Female attrition is larger when measured by video watching but not when measured by quiz completion. Self-ascribed motivation for taking the course is not a predictor of completion. We conclude that raw completion rates cannot be the criterion to judge the success of MOOCs, as they are in the case of traditional courses. The results are consistent with the existing literature, which separates MOOC students into two different groups: Committed Learners and Browsers. Download Paper
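As an illustration of the estimation approach, here is a minimal sketch on simulated data using the third-party lifelines package; the variable names, effect sizes, and 12-week horizon are hypothetical, not the paper's specification or data:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 300
week1_activity = rng.poisson(3, size=n)    # e.g. videos watched in week 1
college_student = rng.integers(0, 2, size=n)

# Simulated dropout times: more first-week activity lowers the hazard,
# college students drop out faster (mirroring the direction of the findings).
hazard = 0.4 * np.exp(-0.25 * week1_activity + 0.3 * college_student)
t = rng.exponential(1 / hazard)
duration = np.minimum(t, 12)               # course runs 12 weeks
event = (t <= 12).astype(int)              # 0 = censored (never dropped out)

df = pd.DataFrame({"duration": duration, "event": event,
                   "week1_activity": week1_activity,
                   "college_student": college_student})
cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()                        # hazard ratios per covariate
```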
The problem of allocating bundles of indivisible objects without transfers arises in the assignment of courses to students, of computing resources (CPU time, memory, disk space) to computing tasks, and of truckloads of food to food banks. In these settings the complementarities in preferences are small compared with the size of the market. We exploit this to design mechanisms satisfying efficiency, envy-freeness and asymptotic strategy-proofness. Informally, we assume that agents do not want bundles that are too large: there is a parameter k such that the marginal utility of any item relative to a bundle of size k or larger is zero. We call such preferences k-demand preferences. Given this parameter, we show how to represent probability shares over bundles as lotteries over approximately feasible (deterministic) integer allocations. The degree of infeasibility in these integer allocations is controlled by the parameter k: ex post, no good is over-allocated by more than k − 1 units. Download Paper
This paper introduces a model of endogenous network formation and systemic risk, with three main findings. First, agents form networks that efficiently trade off the benefits of trade against the possibility of systemic risk. Second, fundamentally ‘safer’ economies generate higher interconnectedness, which in turn leads to higher systemic risk. Third, the structure of the network formed depends on whether the shocks to the system are believed to be correlated or independent of each other. In particular, when shocks are perfectly correlated, the network formed is a complete graph, i.e., there is a link between every pair of agents. This underlines the importance of specifying the shock structure before investigating a given network, because a given network and shock structure could be incompatible. Download Paper
The National Resident Matching Program strives for a stable matching of medical students to teaching hospitals. In the presence of couples, stable matchings need not exist. For any student preferences, we show that each instance of a stable matching problem has a ‘nearby’ instance with a stable matching. The nearby instance is obtained by perturbing the capacities of the hospitals. Specifically, given a reported capacity kₕ for each hospital h, we find a redistribution of the slot capacities k′ₕ satisfying |kₕ − k′ₕ| ≤ 4 for every hospital h, and ∑ₕ kₕ ≤ ∑ₕ k′ₕ ≤ ∑ₕ kₕ + 9, such that a stable matching exists with respect to k′. Our approach is general and applies to other types of complementarities, as well as matchings with side constraints and contracts. Download Paper
As demand increases, airline carriers often increase flight frequencies to meet the larger flow of passengers in their networks, which reduces passengers' schedule delays and attracts more demand. Motivated by this, I develop and estimate a structural model of the U.S. airline industry that accounts for these network effects of demand. Compared with previous studies, the model implies higher cost estimates, which seem more consistent with the unprofitability of the industry; below-marginal-cost pricing becomes possible and appears on many routes. I also study airline mergers and find that the network effects can be the main factor underlying their profitability. Download Paper
We propose a formal model of scientific modeling, geared to applications of decision theory and game theory. The model highlights the freedom that modelers have in conceptualizing social phenomena using general paradigms in these fields. It may shed some light on the distinctions between (i) refutation of a theory and a paradigm, (ii) notions of rationality, (iii) modes of application of decision models, and (iv) roles of economics as an academic discipline. Moreover, the model suggests that all four distinctions have some common features that are captured by the model. Download Paper
To end a financial crisis, the central bank is to lend freely, against good collateral, at a high rate, according to Bagehot’s Rule. We argue that in theory and in practice there is a missing ingredient to Bagehot’s Rule: secrecy. Re-creating confidence requires that the central bank lend in secret, hiding the identities of the borrowers, to prevent information about individual collateral from being produced and to create an information externality by raising the perceived value of average collateral. Ironically, the participation of "bad" borrowers, with low quality collateral, in the central bank’s lending program is a desirable part of re-creating confidence because it creates stigma. Stigma is critical to sustain secrecy because no borrower wants to reveal his participation in the lending program, and it is limited by the central bank charging a high rate for its loans. Download Paper
We show that political booms, measured by the rise in governments’ popularity, predict financial crises above and beyond other better-known early warning indicators, such as credit booms. This predictive power, however, only holds in emerging economies. We show that governments in emerging economies are more concerned about their reputation and tend to ride the short-term popularity benefits of weak credit booms rather than implementing politically costly corrective policies that would help prevent potential crises. We provide evidence of the relevance of this reputation mechanism. Download Paper
This paper constructs a dynamic model of health insurance to evaluate the short- and long-run effects of policies that prevent firms from conditioning wages on the health conditions of their workers, and that prevent health insurance companies from charging individuals with adverse health conditions higher insurance premia. Our study is motivated by recent US legislation that has tightened regulations on wage discrimination against workers with poorer health status (the Americans with Disabilities Act of 1990, ADA, and its 2008 amendment, the ADAAA) and that prohibits health insurance companies from charging different premiums for workers of different health status starting in 2014 (Patient Protection and Affordable Care Act, PPACA). In the model, a trade-off arises between the static gains from better insurance against poor health induced by these policies and their adverse dynamic incentive effects on household efforts to lead a healthy life. Using household panel data from the PSID, we estimate and calibrate the model and then use it to evaluate the static and dynamic consequences of no-wage-discrimination and no-prior-conditions laws for the evolution of the cross-sectional health and consumption distribution of a cohort of households, as well as for the ex-ante lifetime utility of a typical member of this cohort. In our quantitative analysis we find that although the competitive equilibrium features too little consumption insurance and a combination of both policies is effective in providing such insurance period by period, it is suboptimal to introduce both policies jointly, since such a policy innovation severely undermines the incentives to lead healthier lives and thus induces a more rapid deterioration of the cohort health distribution over time. This effect more than offsets the static gains from better consumption insurance, so that expected discounted lifetime utility is lower under both policies relative to implementing wage nondiscrimination legislation alone. This is true despite the fact that both policy options are strongly welfare improving relative to the competitive equilibrium. Download Paper
Banks are optimally opaque institutions. They produce debt for use as a transaction medium (bank money), which requires that information about the backing assets – loans – not be revealed, so that bank money does not fluctuate in value, reducing the efficiency of trade. This need for opacity conflicts with the production of information about investment projects, needed for allocative efficiency. Intermediaries exist to hide such information, so banks select portfolios of information-insensitive assets. For the economy as a whole, firms endogenously separate into bank finance and capital market/stock market finance depending on the cost of producing information about their projects. Download Paper
This paper investigates how cooperation can be sustained in large societies even in the presence of agents who never cooperate. To do this, we consider a large but finite society in which agents are randomly matched in pairs each period. Within each match, nature decides which agent needs help in order to avoid some loss, and which agent can help him by incurring a cost smaller than the loss. Each agent observes only his own history, and we assume that agents do not recognize each other. We introduce and characterize a class of equilibria, named linear equilibria, in which cooperation takes place. Within this class, efficiency can be achieved with simple one-period strategies, which are close to a tit-for-tat strategy when the society is large, and which generate smooth dynamics of the expected average level of cooperation. Unlike previously suggested equilibria in similar environments, our equilibria are robust to the presence of behavioral agents and to other perturbations of the base model. We also apply our model to bilateral trade with many traders, where we find that the mechanism transmitting defections is transmissive rather than contagious, unlike in our base model. Download Paper
This paper empirically investigates the fiscal and welfare trade-offs involved in designing a pension system when workers can avoid participation by working informally. A dynamic behavioral model captures a household's labor supply, formal/informal sector choice, and saving decisions under the rules of Chile's canonical privatized pension system. The parameters governing household preferences and earnings opportunities in the formal and informal sectors are jointly estimated using a longitudinal survey linked with administrative data from the pension system's regulatory agency. The parameter estimates imply that rationing of formal jobs is limited and that mandatory pension contributions play a sizeable role in encouraging informality. Our policy experiments show that Chile could reduce minimum pension costs by 23%, while guaranteeing the same level of income in retirement, by increasing the rate at which the benefits taper off. Download Paper
Asymmetric information is an important source of inefficiency when an asset (such as a firm) is transacted. The two main sources of this asymmetry are unobserved idiosyncratic characteristics of the asset (such as future profitability) and unobserved idiosyncratic choices (like secret price cuts). Buyers may use noisy signals (such as sales) in order to infer actions and characteristics. In this situation, does the seller prefer to release information fast or slowly? Is it incentive compatible? When the market is pessimistic, is it better to give up or keep signaling? We introduce hidden actions in a dynamic signaling model in order to answer these questions. Separation is found to be fast in equilibrium when sending highly informative signals is more efficient than sending less informative signals. When the market is pessimistic about the quality of the asset, depending on the cost structure, the seller either “gives up” by stopping signaling, or “rushes out” by increasing the informativeness of the signal. We find that the unobservability of the action causes equilibrium effort to be too low and the seller to stop signaling too early. The model can be applied to education, where grades depend on students’ effort, which is endogenously related to their skills. Download Paper
The paper studies inference in nonlinear models where loss of identification occurs in multiple parts of the parameter space. For uniform inference, we develop a local limit theory that models mixed identification strength. Building on this non-standard asymptotic approximation, we suggest robust tests and confidence intervals in the presence of non-identified and weakly identified nuisance parameters. In particular, this covers applications where some nuisance parameters are non-identified under the null (Davies (1977, 1987)) and some nuisance parameters are subject to the full range of identification strength. The asymptotic results involve both inconsistent estimators that depend on a localization parameter and consistent estimators with different rates of convergence. A sequential argument is used to peel the criterion function based on the identification strength of the parameters. The robust test is uniformly valid and non-conservative. Download Paper
I provide empirical evidence of changes in the U.S. Treasury yield curve and related macroeconomic factors, and investigate whether the changes are brought about by external shocks, monetary policy, or by both. To explore this, I characterize bond market exposures to macroeconomic and monetary policy risks, using an equilibrium term structure model with recursive preferences in which inflation dynamics are endogenously determined. In my model, the key risks that affect bond market prices are changes in the correlation between growth and inflation and changes in the conduct of monetary policy. Using a novel estimation technique, I find that the changes in monetary policy affect the volatility of yield spreads, while the changes in the correlation between growth and inflation affect both the level as well as the volatility of yield spreads. Consequently, the changes in the correlation structure are the main contributor to bond risk premia and to bond market volatility. The time variations within a regime and risks associated with moving across regimes lead to the failure of the Expectations Hypothesis and to the excess bond return predictability regression of Cochrane and Piazzesi (2005), as in the data. Download Paper
We consider the dynamic pricing problem of a durable good monopolist with full commitment power, when a new version of the good is expected at some point in the future. The new version of the good is superior to the existing one, bringing a higher flow utility. If the arrival is a stationary stochastic process, then the corresponding optimal price path is shown to be constant for both versions of the good; hence there is no delay in purchases and time is not used to discriminate among buyers, which is in line with the literature. However, if the arrival of the new version occurs at a commonly known deterministic date, then the optimal price path may be decreasing over time, resulting in delayed purchases. For both stochastic and deterministic arrival processes, posted prices are not the optimal mechanism; the optimal mechanism instead involves bundling the new and old versions of the good and selling them only together. Download Paper
The recent public debt crisis in most developed economies implies an urgent need for increasing tax revenues or cutting government spending. In this paper we study the importance of household heterogeneity and the progressivity of the labor income tax schedule for the ability of the government to generate tax revenues. We develop an overlapping generations model with uninsurable idiosyncratic risk, endogenous human capital accumulation, and labor supply decisions along the intensive and extensive margins. We calibrate the model to macro, micro and tax data from the US as well as a number of European countries, and then for each country characterize the labor income tax Laffer curve under the current country-specific progressivity of the labor income tax code. We find that more progressive labor income taxes significantly reduce tax revenues. For the US, converting to a flat tax code raises the peak of the Laffer curve by 7%. We also find that modeling household heterogeneity is important for the shape of the Laffer curve. Download Paper
This paper analyzes a dynamic education signaling model with dropout risk. Workers pay an education cost per unit of time and face an exogenous dropout risk before graduation. Since low-productivity workers’ cost of education is high, pooling with early dropouts helps them avoid a high education cost. In equilibrium, low-productivity workers choose to endogenously drop out over time, so the productivity of workers in college increases as the education process progresses. We find that the exogenous dropout risk generates rich dynamics in the endogenous dropout behavior of workers, and that the maximum education length is decreasing in the prior probability that a worker is highly productive. We also extend the baseline model by allowing for human capital accumulation. Download Paper
The last three recessions in the United States were followed by jobless recoveries: while labor productivity recovered, unemployment remained high. In this paper, we show that countercyclical unemployment benefit extensions lead to jobless recoveries. We augment the standard Mortensen-Pissarides model to incorporate unemployment benefits expiration and state-dependent extensions of unemployment benefits. In the model, an extension of unemployment benefits slows down the recovery of vacancy creation in the aftermath of a recession. We calibrate the model to US data and show that it is quantitatively consistent with observed labor market dynamics, in particular the emergence of jobless recoveries after 1990. Furthermore, counterfactual experiments indicate that unemployment benefits are quantitatively important in explaining jobless recoveries. Download Paper
We propose a new category of consumption goods, memorable goods, that generate a flow of utility after consumption. We analyze an otherwise standard consumption model that distinguishes memorable goods from other nondurable goods. Consumers optimally choose lumpy consumption of memorable goods. We then empirically document significant differences between the levels and volatilities of memorable and other nondurable good expenditures. In two applications we find that the welfare cost of consumption fluctuations driven by income shocks is significantly overstated if memorable goods are not accounted for, and that estimates of excess sensitivity of consumption might be entirely due to memorable goods. Download Paper
Research from the United States shows that gaps in early cognitive and non-cognitive ability appear early in the life cycle. Little is known about this important question for developing countries. This paper provides new evidence of sharp differences in cognitive development by socioeconomic status in early childhood for five Latin American countries. To help with comparability, we use the same measure of receptive language ability for all five countries. We find important differences in development in early childhood across countries, and steep socioeconomic gradients within every country. For the three countries where we can follow children over time, there are few substantive changes in scores once children enter school. Our results are robust to different ways of defining socioeconomic status, to different ways of standardizing outcomes, and to selective non-response on our measure of cognitive development. Download Paper
The paper studies equilibrium pricing in a product market for an indivisible good where buyers search for sellers. Buyers search sequentially, but do not meet every seller with the same probability. Specifically, a fraction of the buyers’ meetings leads to one particular large seller, while the remaining meetings lead to one of a continuum of small sellers. In this environment, the small sellers would like to set a price that makes the buyers indifferent between purchasing the good and searching for another seller. The large seller would like to price the small sellers out of the market by posting a price that is low enough to induce buyers not to purchase from the small sellers. These incentives give rise to a game of cat and mouse, whose only equilibrium involves mixed strategies for both the large and the small sellers. The fact that the small sellers play mixed strategies implies that there is price dispersion. The fact that the large seller plays mixed strategies implies that prices and allocations vary over time. We show that the fraction of the gains from trade accruing to the buyers is positive and non-monotonic in the degree of market power of the large seller. As long as the large seller has some positive but incomplete market power, the fraction of the gains from trade accruing to the buyers depends in a natural way on the extent of search frictions. Download Paper
Credit booms usually precede financial crises. However, some credit booms end in a crisis (bad booms) and others do not (good booms). We document that, while all booms start with an increase in the growth of Total Factor Productivity (TFP), such growth falls much faster subsequently for bad booms. We then develop a simple framework to explain this. Firms finance investment opportunities with short-term collateralized debt. If agents do not produce information about the collateral quality, a credit boom develops, accommodating firms with lower-quality projects and increasing the incentives of lenders to acquire information about the collateral, eventually triggering a crisis. When the quality of investment opportunities also grows, the credit boom may not end in a crisis: although lower-quality projects are gradually adopted, those projects are of better quality than they otherwise would be, so lenders are not induced to acquire information about the collateral. Download Paper