
Model Uncertainty in Sequential Decision Making


The deployment of autonomous systems in the real world presents opportunities to combine model-based reasoning with data-driven models. To approach this systematically, one must acknowledge the inherent inaccuracy and underspecification of data-driven models; understanding and reasoning about model uncertainty is therefore crucial. This reasoning can be qualitative, in the form of policies that are robust to variations in the model, or quantitative, in the form of policies that operate on a belief space over models. This introductory course will focus on understanding and reasoning about model uncertainty. We will introduce several extensions of the standard Markov decision process (MDP) framework for planning under uncertainty that enable reasoning over model uncertainty. We will begin with the basics of planning for MDPs, then use the context of long-term autonomy to demonstrate the need for reasoning about model uncertainty. We will then cover three extensions of MDPs designed for this purpose: interval MDPs, uncertain MDPs, and Bayes-adaptive MDPs.
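To make the interval-MDP idea concrete, the following is a minimal sketch (not from the course materials) of pessimistic value iteration for an interval MDP: transition probabilities are only known to lie in per-entry intervals `[P_low, P_high]`, and the Bellman backup evaluates each action against the worst-case distribution consistent with those intervals. The function name, array shapes, and the specific two-state examples below are illustrative assumptions, not part of the course.

```python
import numpy as np

def robust_value_iteration(P_low, P_high, R, gamma=0.95, tol=1e-8, max_iter=10_000):
    """Pessimistic value iteration for an interval MDP (illustrative sketch).

    P_low, P_high: arrays of shape (S, A, S) bounding transition probabilities,
        with P_low[s, a].sum() <= 1 <= P_high[s, a].sum() for every (s, a).
    R: rewards of shape (S, A).
    Returns the robust (worst-case) value function V.
    """
    S, A, _ = P_low.shape
    V = np.zeros(S)
    for _ in range(max_iter):
        V_new = np.empty(S)
        for s in range(S):
            q = np.empty(A)
            for a in range(A):
                # The adversary picks the distribution in the interval that
                # minimizes expected value: start from the lower bounds and
                # assign the remaining probability mass to low-value successor
                # states first, up to each state's upper bound.
                order = np.argsort(V)  # successor states, ascending in value
                p = P_low[s, a].copy()
                budget = 1.0 - p.sum()
                for sp in order:
                    add = min(P_high[s, a, sp] - p[sp], budget)
                    p[sp] += add
                    budget -= add
                    if budget <= 0:
                        break
                q[a] = R[s, a] + gamma * p @ V
            V_new[s] = q.max()
        if np.max(np.abs(V_new - V)) < tol:
            return V_new
        V = V_new
    return V
```

With degenerate intervals (`P_low == P_high`) this reduces to standard value iteration, so the same loop illustrates the plain MDP case covered at the start of the course; widening the intervals can only decrease the robust value.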


Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International license.