
Distributed Exploration in Multi-Armed Bandits

Published on 2014-11-07 · 1803 views

We study exploration in Multi-Armed Bandits (MAB) in a setting where k players collaborate in order to identify an ε-optimal arm. Our motivation comes from recent employment of MAB algorithms in computationally intensive, large-scale applications.
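To make the setting concrete, here is a minimal Python sketch of collaborative exploration with a single communication round: each of k simulated players pulls every arm of a Bernoulli bandit the same number of times, the players' empirical means are averaged once, and the arm with the highest combined estimate is returned. The function names (`explore`, `collaborative_best_arm`), the number of pulls, and the arm means are illustrative assumptions; this is not the algorithm analyzed in the talk, only a toy version of the problem it studies.

```python
import numpy as np

def explore(means, pulls_per_arm, rng):
    """One player: pull every arm `pulls_per_arm` times, return empirical means."""
    return np.array([rng.binomial(pulls_per_arm, p) / pulls_per_arm for p in means])

def collaborative_best_arm(means, k=4, pulls_per_arm=2000, seed=0):
    """k players explore independently; one communication round averages their
    estimates, and the arm with the highest averaged estimate is selected."""
    rng = np.random.default_rng(seed)
    estimates = np.stack([explore(means, pulls_per_arm, rng) for _ in range(k)])
    combined = estimates.mean(axis=0)  # the single round of communication
    return int(np.argmax(combined)), combined

if __name__ == "__main__":
    true_means = [0.3, 0.5, 0.45, 0.6, 0.55]  # Bernoulli arms (assumed example)
    arm, est = collaborative_best_arm(true_means)
    print("selected arm:", arm, "combined estimates:", np.round(est, 3))
```

With the arm means above, the selected arm is ε-optimal whenever the combined estimates rank the best arm (mean 0.6) within ε of the top; the point of the sketch is only to show how pulls are split across players and merged in one message exchange.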

