Some Aspects of Learning Rates for SVMs

Posted in Science on August 19, 2008

We present some learning rates for support vector machine classification. In particular, we discuss a recently proposed geometric noise assumption that allows one to bound the approximation error for Gaussian RKHSs. Furthermore, we show how a noise assumption proposed by Tsybakov can be used to obtain learning rates between 1/sqrt(n) and 1/n. Finally, we describe the influence of the approximation error on the overall learning rate.
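As a rough sketch of how Tsybakov's noise assumption interpolates between the two rates mentioned above: the condition bounds the mass near the decision boundary by an exponent q, and (ignoring the approximation error) the resulting excess-risk rate improves with q. The exact exponent form below is the standard statement of the condition and the rate commonly derived from it, not a formula quoted from the talk itself.

```latex
% Tsybakov noise condition with exponent q >= 0, where
% \eta(x) = P(Y = 1 \mid X = x) is the regression function:
\[
  \mathbb{P}_X\bigl( |2\eta(X) - 1| \le t \bigr) \;\le\; C\, t^{q},
  \qquad t > 0 .
\]
% Neglecting the approximation error, this yields excess-risk
% rates of order
\[
  n^{-\frac{q+1}{q+2}} ,
\]
% which gives the slow rate n^{-1/2} for q = 0 (no assumption)
% and approaches the fast rate n^{-1} as q -> infinity.
```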

Author: Ingo Steinwart, Los Alamos National Laboratory

Tags: Science, Lectures, Computer Science, Machine Learning, VideoLectures.Net, Kernel Methods