
Some Aspects of Learning Rates for SVMs

Feb 25, 2007 · 5774 views

We present some learning rates for support vector machine classification. In particular, we discuss a recently proposed geometric noise assumption that allows one to bound the approximation error for Gaussian RKHSs. Furthermore, we show how a noise assumption proposed by Tsybakov can be used to obtain learning rates between 1/sqrt(n) and 1/n. Finally, we describe the influence of the approximation error on the overall learning rate.
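As an illustrative sketch only (not part of the talk), the setting analyzed above — SVM classification with a Gaussian RBF kernel on data with a small amount of label noise, the kind of regime Tsybakov-type noise assumptions describe — can be set up as follows. The data, kernel width, and regularization parameter here are arbitrary choices for illustration, and scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200

# Two-dimensional Gaussian inputs with labels given by the sign of x1 + x2,
# then flipped with small probability to simulate label noise.
X = rng.normal(size=(n, 2))
y = np.sign(X[:, 0] + X[:, 1])
flip = rng.random(n) < 0.05
y[flip] *= -1

# Gaussian (RBF) kernel SVM; gamma controls the kernel width, C the
# regularization strength. Both values here are illustrative defaults.
clf = SVC(kernel="rbf", gamma=1.0, C=1.0).fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```

How fast the test error of such a classifier approaches the best possible error as n grows — somewhere between 1/sqrt(n) and 1/n, depending on the noise exponent — is precisely what the learning rates in the talk quantify.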

Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International license.