Informational Herding with Model Misspecification, Second Version
This paper demonstrates that a misspecified model of information processing interferes with long-run learning and allows inefficient choices to persist in the face of contradictory public information. I consider an observational learning environment where agents observe a private signal about a hidden state, and some agents observe the actions of their predecessors. Prior actions aggregate multiple sources of correlated information about the state, so agents face the inferential challenge of distinguishing new information from redundant information. When individuals significantly overestimate the amount of new information, beliefs about the state become entrenched and incorrect learning may occur. When individuals sufficiently overestimate the amount of redundant information, beliefs are fragile and learning is incomplete. Learning is complete when agents have an approximately correct model of inference, establishing that the correct model is robust to perturbation. These results have important implications for the timing, frequency, and strength of policy interventions designed to facilitate learning.
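The entrenchment mechanism described above can be illustrated with a stylized simulation. This is a minimal sketch, not the paper's formal model: it assumes a binary state and binary signals, and introduces a hypothetical parameter `kappa` that scales how much information each predecessor's action is perceived to carry. Values of `kappa` above 1 correspond to overestimating the amount of new information in prior actions; values below 1 correspond to overestimating redundancy.

```python
import math
import random

def simulate(n_agents, p_signal, kappa, seed=0):
    """Stylized sequential learning with misspecified inference.

    Hypothetical setup (illustration only): true state is 1; each agent's
    private signal matches the state with probability p_signal. Agents act on
    the sign of a perceived public log-likelihood ratio plus their private
    signal, and each observed action is (mis)read as an independent signal
    whose strength is scaled by kappa.
    """
    rng = random.Random(seed)
    llr_signal = math.log(p_signal / (1 - p_signal))  # strength of one signal
    perceived_llr = 0.0  # public belief under the misspecified model
    actions = []
    for _ in range(n_agents):
        s = 1 if rng.random() < p_signal else 0  # private signal draw
        private = llr_signal if s == 1 else -llr_signal
        a = 1 if perceived_llr + private > 0 else 0  # act on total belief
        actions.append(a)
        # Misspecification: each action counts as kappa units of information,
        # even once agents are herding and actions carry no new information.
        perceived_llr += kappa * (llr_signal if a == 1 else -llr_signal)
    return perceived_llr, actions
```

With `kappa` large, a single early contrarian signal can push the perceived public belief past the point where any private signal can overturn it, so all subsequent agents herd on the wrong action and the belief grows ever more entrenched, mirroring the incorrect-learning result in the abstract.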