Adaptive Dimension Reduction Using Discriminant Analysis and K-means Clustering

Posted in Science on October 20, 2008

Regularized Kernel Discriminant Analysis (RKDA) performs linear discriminant analysis in the feature space via the kernel trick. Its performance depends on the choice of kernel. In this paper, we consider the problem of learning an optimal kernel over a convex set of kernels. We show that in the binary-class case the kernel learning problem can be formulated as a semidefinite program (SDP), and we extend this SDP formulation to the multi-class case. The extension rests on a key result established in the paper: the multi-class kernel learning problem can be decomposed into a set of binary-class kernel learning problems. In addition, we propose an approximation scheme that reduces the computational complexity of the multi-class SDP formulation. The performance of RKDA also depends on the value of the regularization parameter; we show that this value can be learned automatically within the same framework. Experimental results on benchmark data sets demonstrate the efficacy of the proposed SDP formulations.
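To make the binary-class setting that RKDA builds on concrete, here is a minimal sketch of a regularized kernel Fisher discriminant with a fixed kernel. It is not the paper's SDP kernel-learning formulation; the RBF kernel choice, function names, and parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix (an assumed kernel choice for illustration)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfd_fit(K, y, reg=1e-2):
    """Binary regularized kernel Fisher discriminant.

    K   : (n, n) kernel matrix over the training points
    y   : labels in {0, 1}
    reg : regularization parameter (fixed here; the paper learns it)
    Returns the dual coefficient vector alpha defining the projection.
    """
    n = len(y)
    class_means = []
    N = np.zeros((n, n))  # within-class scatter in the kernel space
    for c in (0, 1):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                       # n x n_c
        nc = len(idx)
        class_means.append(Kc.mean(axis=1))  # kernelized class mean
        # center the class block and accumulate its scatter
        N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
    diff = class_means[1] - class_means[0]
    # regularized generalized eigenproblem reduces to one linear solve
    return np.linalg.solve(N + reg * np.eye(n), diff)
```

A new point x is scored as `rbf_kernel(x, X_train) @ alpha` and classified against a threshold, e.g. the midpoint of the two class score means. The SDP formulation in the paper would additionally optimize the kernel over a convex set rather than fixing it as done here.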

Author: Shuiwang Ji, The Biodesign Institute, Arizona State University

Tags: Science, Lectures, Computer Science, Clustering, Machine Learning, VideoLectures.Net, Preprocessing