Some Aspects of Learning Rates for SVMs
We present some learning rates for support vector machine classification. In particular, we discuss a recently proposed geometric noise assumption that allows us to bound the approximation error for Gaussian RKHSs. Furthermore, we show how a noise assumption proposed by Tsybakov can be used to obtain learning rates between 1/sqrt(n) and 1/n. Finally, we describe the influence of the approximation error on the overall learning rate.
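To make the interpolation between 1/sqrt(n) and 1/n concrete, the following display sketches the typical form such rates take under Tsybakov's noise condition; the exponent q and the exact rate shown here are illustrative of this style of result, not necessarily the precise exponents established in the paper:

```latex
% Under Tsybakov's noise condition with exponent q \in [0, \infty],
% excess-risk rates of the illustrative form
\[
  \mathcal{R}(f_n) - \mathcal{R}^* \;=\; O\!\left( n^{-\frac{q+1}{q+2}} \right)
\]
% interpolate between n^{-1/2} (q = 0, i.e. no noise assumption)
% and n^{-1} (q \to \infty, the low-noise limit).
```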