A Foundation for Markov Equilibria with Finite Social Memory

We study stochastic games with an infinite horizon and sequential moves played by an arbitrary number of players. We assume that social memory is finite: every player, except possibly one, is finitely lived and cannot observe events that are sufficiently far back in the past. This class of games includes games between a long-run player and a sequence of short-run players, as well as games with overlapping generations of players. Indeed, any stochastic game with infinitely lived players can be reinterpreted as one with finitely lived players: each finitely lived player is replaced by a successor and receives the value of the successor's payoff. This value may arise from altruism, but the player also receives such a value if he can “sell” his position in a competitive market. In both cases, his objective is to maximize infinite-horizon payoffs, though his information about past events is limited. An equilibrium is purifiable if some close-by behavior is consistent with equilibrium when agents' payoffs in each period are perturbed additively and independently. We show that only Markov equilibria are purifiable when social memory is finite. Thus, if a game has at most one long-run player, all purifiable equilibria are Markov.
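As a rough sketch of the payoff perturbation described above (the notation here, including $u_i$, $s_t$, $a_t$, $z^i_t$, and $\varepsilon$, is ours and not taken from the paper), player $i$'s perturbed period-$t$ payoff might take the form
\[
\tilde{u}^{\varepsilon}_i(a_t, s_t, z^i_t) \;=\; u_i(a_t, s_t) \;+\; \varepsilon\, z^i_t(a^i_t),
\]
where $u_i$ is the unperturbed stage payoff, $s_t$ the state, $a_t$ the action profile, $z^i_t$ a payoff shock drawn independently across players and periods, and $\varepsilon > 0$ small. On this reading, an equilibrium of the unperturbed game is purifiable if, for small $\varepsilon$, some nearby behavior remains consistent with equilibrium of the perturbed game.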

Paper Number: 12-003
Year: 2012