We examine a two-player game with two-armed exponential bandits à la Keller et al. (Econometrica 73:39–68, 2005), where players operate different technologies for exploring the risky option. We characterise the set of Markov perfect equilibria and show that there always exists an equilibrium in which the player with the inferior technology uses a cut-off strategy. All Markov perfect equilibria imply the same amount of experimentation but differ with respect to the expected speed of the resolution of uncertainty. If and only if the degree of asymmetry between the players is high enough, there exists a Markov perfect equilibrium in which both players use cut-off strategies. Whenever this equilibrium exists, it welfare dominates all other equilibria.
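For orientation, here is a minimal sketch of the belief dynamics underlying the exponential-bandit framework invoked above, under the standard assumptions of Keller et al. (2005) with player-specific breakthrough intensities; the symbols $\lambda_1$, $\lambda_2$, $k_{i,t}$, $p_t$ are illustrative and not taken from the paper. Each player $i$ allocates a fraction $k_{i,t} \in [0,1]$ of her resource to the risky arm, which yields lump-sum payoffs at Poisson rate $\lambda_i$ if the arm is good and nothing if it is bad. Conditional on no breakthrough having occurred, the common posterior belief $p_t$ that the risky arm is good drifts down according to
\[
\dot p_t \;=\; -\,p_t\,(1-p_t)\,\bigl(\lambda_1 k_{1,t} + \lambda_2 k_{2,t}\bigr),
\]
and jumps to $1$ at the first breakthrough, since breakthroughs are conclusive. A cut-off strategy for player $i$ then takes the form $k_{i,t} = 1$ if $p_t$ exceeds some threshold belief $p_i^*$ and $k_{i,t} = 0$ otherwise.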