
On the relation between Bayesian inference and certain solvable problems of stochastic control

Oct 9, 2008 · 4690 views

Optimal control for nonlinear stochastic dynamical systems requires the solution of a nonlinear PDE, the so-called Hamilton–Jacobi–Bellman (HJB) equation. Recently, Bert Kappen and Emanuel Todorov have shown that for certain types of cost functions, this equation can be transformed into a linear problem which is mathematically related to a Bayesian estimation problem. This has led to novel efficient algorithms for optimal control of such systems. I will show a simple proof of this surprising result and discuss some possible implications.
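A sketch of the transformation the abstract refers to, following the path-integral control setting of Kappen and the linearly-solvable framework of Todorov. The specific dynamics, cost structure, and the compatibility condition \(\lambda R^{-1} = \nu\) below are assumptions from that literature, not stated in the abstract:

```latex
% Stochastic dynamics with control-affine drift and quadratic control cost:
%   dx = (f(x) + B u)\,dt + B\,d\xi,  cost rate  q(x) + \tfrac12 u^\top R u,
% where d\xi is Brownian noise with covariance \nu\,dt.
% HJB equation for the value function V(x,t):
-\partial_t V = \min_u \Big[ q + \tfrac12 u^\top R u
    + (f + Bu)^\top \nabla V
    + \tfrac12 \operatorname{tr}\!\big(B \nu B^\top \nabla^2 V\big) \Big]
% The minimizer is u^* = -R^{-1} B^\top \nabla V; substituting it back:
-\partial_t V = q + f^\top \nabla V
    - \tfrac12 (\nabla V)^\top B R^{-1} B^\top \nabla V
    + \tfrac12 \operatorname{tr}\!\big(B \nu B^\top \nabla^2 V\big)
% Assume the noise and control cost are compatible: \lambda R^{-1} = \nu.
% Then the log transform  V = -\lambda \log \psi  cancels the quadratic
% term exactly, leaving a PDE that is *linear* in \psi:
-\partial_t \psi = -\frac{q}{\lambda}\,\psi + f^\top \nabla \psi
    + \tfrac12 \operatorname{tr}\!\big(B \nu B^\top \nabla^2 \psi\big)
```

The resulting equation has the form of a backward Chapman–Kolmogorov equation for a diffusion with state-dependent killing rate \(q/\lambda\), which is exactly the kind of linear operator that appears in Bayesian filtering and smoothing; this is the sense in which the control problem becomes an estimation problem.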


Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International license.