Minimum Error Entropy Principle for Learning

Aug 26, 2013

Information theoretic learning introduces ideas from information theory into the machine learning paradigm. Minimum error entropy is a principle of information theoretic learning that yields a family of supervised learning algorithms. It serves as a substitute for the classical least squares method when the noise is non-Gaussian. The idea is to extract from the data as much information as possible about the data-generating system by minimizing error entropies. In this talk we discuss minimum error entropy algorithms in a regression setting, obtained by minimizing the empirical Rényi entropy of order 2. Consistency results and learning rates are presented. In particular, error estimates dealing with heavy-tailed noise will be given.
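
To make the principle concrete, below is a minimal sketch, not the speaker's actual method, of minimum error entropy regression for a linear model. The empirical Rényi entropy of order 2 of the residuals, H_2(e) = -log V(e), is minimized by maximizing a Parzen-window estimate of the information potential V(e) via gradient ascent. The kernel bandwidth `sigma`, the learning rate `lr`, the iteration count, and the toy data are illustrative assumptions, not details taken from the talk.

```python
import numpy as np

def information_potential(errors, sigma=1.0):
    """Parzen-window estimate of the quadratic information potential V(e).

    The empirical Renyi entropy of order 2 is H2(e) = -log V(e), so
    minimizing H2 is the same as maximizing V.
    """
    diffs = errors[:, None] - errors[None, :]
    # Convolving two Gaussian kernels of bandwidth sigma yields a Gaussian
    # kernel of bandwidth sigma * sqrt(2) on the pairwise error differences.
    k = np.exp(-diffs**2 / (4 * sigma**2)) / (2 * sigma * np.sqrt(np.pi))
    return k.mean()

def mee_linear_fit(X, y, sigma=1.0, lr=0.5, n_iter=300):
    """Fit a linear model y ~ X @ w by gradient ascent on V(e), e = y - X @ w.

    sigma, lr and n_iter are illustrative choices and may need tuning.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        e = y - X @ w
        diffs = e[:, None] - e[None, :]
        k = np.exp(-diffs**2 / (4 * sigma**2)) / (2 * sigma * np.sqrt(np.pi))
        # dV/dw via the chain rule: d(e_i - e_j)/dw = -(x_i - x_j)
        pair_grads = (k * diffs)[:, :, None] * (X[:, None, :] - X[None, :, :])
        w += lr * pair_grads.mean(axis=(0, 1)) / (2 * sigma**2)
    return w

# Toy example with heavy-tailed (Student-t) noise, where least squares struggles.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + rng.standard_t(df=1.5, size=200)
w_hat = mee_linear_fit(X, y, sigma=1.0)
print("estimated weights:", w_hat)
# Note: the MEE criterion is shift-invariant, so any intercept has to be
# recovered separately, e.g. by centering the final residuals.
```

Since the error entropy does not change when all residuals are shifted by a constant, only the slope is identified by this criterion; the intercept is usually fixed afterwards from the mean or median of the residuals.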

