On Sequence Kernels for SVM classification of sets of vectors

Posted in Conferences, Companies, Science on May 04, 2007


Google Tech Talks
April 10, 2007


Support Vector Machines (SVMs) have become one of the most popular tools for discriminative classification of static data, but SVM classification of dynamic (continuous) data has only recently attracted research interest. In this presentation, I first give an overview of existing sequence kernels for classification of sets of vectors. I then present a new family of sequence kernels that generalizes the Generalized Linear Discriminant Sequence (GLDS) kernel. Unlike GLDS, the new sequence kernels allow implicit, normalized expansions in a high- or infinite-dimensional feature space (FS). Moreover, they induce a Mahalanobis distance in the FS, which makes them kernels between distributions in that space. The exact form of the new sequence kernels requires computing the Gram matrix on the training data and may thus be intractable for large-scale problems. To overcome this, we use a low-rank approximation of the Gram matrix, which yields an approximate but tractable form.
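As a rough illustration of the two ingredients above, here is a minimal Python sketch. It is a toy under stated assumptions, not the method from the talk: the RBF base kernel, the Nystrom method as the low-rank scheme, and all function names and toy data below are illustrative choices of mine. The sketch builds a low-rank feature map from a Gram matrix on a landmark subset, then scores two sequences by the Mahalanobis inner product of their mean feature-space expansions, in the spirit of GLDS-style sequence kernels.

    import numpy as np

    rng = np.random.default_rng(0)

    def rbf(x, y, gamma=0.5):
        # Base kernel between individual frame vectors (illustrative choice).
        d = x - y
        return np.exp(-gamma * (d @ d))

    def nystrom_map(landmarks, gamma=0.5):
        # Low-rank stand-in for the implicit feature map: from the Gram
        # matrix W on a landmark subset, phi(x) = Lambda^{-1/2} U^T k(x, .),
        # so that phi(x) @ phi(y) approximates rbf(x, y).
        W = np.array([[rbf(a, b, gamma) for b in landmarks] for a in landmarks])
        vals, vecs = np.linalg.eigh(W)
        keep = vals > 1e-10
        proj = vecs[:, keep] / np.sqrt(vals[keep])      # shape (m, r)
        def phi(x):
            kx = np.array([rbf(x, z, gamma) for z in landmarks])
            return proj.T @ kx                          # shape (r,)
        return phi

    def whitener(background_seqs, phi, eps=1e-6):
        # Estimate the second-moment matrix R of the expansions on
        # background data and return R^{-1/2}; the Mahalanobis inner
        # product m_A^T R^{-1} m_B becomes a dot product of whitened means.
        Phi = np.array([phi(x) for seq in background_seqs for x in seq])
        R = Phi.T @ Phi / len(Phi) + eps * np.eye(Phi.shape[1])
        vals, vecs = np.linalg.eigh(R)
        return vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T

    def sequence_kernel(seq_a, seq_b, phi, R_inv_sqrt):
        # Kernel between two sets of vectors: the Mahalanobis inner
        # product of their mean feature-space expansions.
        m_a = R_inv_sqrt @ np.mean([phi(x) for x in seq_a], axis=0)
        m_b = R_inv_sqrt @ np.mean([phi(x) for x in seq_b], axis=0)
        return float(m_a @ m_b)

    # Toy usage: three "utterances", each a variable-length set of 4-dim vectors.
    seqs = [rng.normal(size=(n, 4)) for n in (30, 45, 25)]
    pool = np.vstack(seqs)
    landmarks = pool[rng.choice(len(pool), size=20, replace=False)]
    phi = nystrom_map(landmarks)
    R_inv_sqrt = whitener(seqs, phi)
    print(sequence_kernel(seqs[0], seqs[1], phi, R_inv_sqrt))

One design point worth noting: whitening the means with R^{-1/2} reduces the Mahalanobis inner product to an ordinary dot product, so each sequence's whitened mean expansion can be computed once and reused across all kernel evaluations.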

We use a NIST speaker verification task as a vehicle to demonstrate the effectiveness of the new kernels. The results show that the new SVM system outperforms existing ones and performs competitively with the state-of-the-art (generative) UBM-GMM system. Fusing the two systems improves the results further, which demonstrates the complementarity of the two approaches.

Watch Video

Tags: Techtalks, Google, Conferences, Science, Lectures, Statistics, Math, Computer Science, Broadcasting, Companies