Finite-Dimensional Projection for Classification and Statistical Learning
Abstract
A new method for the binary classification problem is studied. It relies on empirical minimization of the hinge risk over an increasing sequence of finite-dimensional spaces. A suitable dimension is selected by minimizing the regularized risk, where the regularization term is proportional to the dimension. An oracle-type inequality is established for the excess generalization risk (i.e. the regret with respect to the Bayes classifier) of the procedure, which ensures adequate convergence properties of the method. We suggest selecting the sequence of subspaces by applying kernel principal component analysis. In this case the asymptotic convergence rate of the method can be better than what is known for the Support Vector Machine. Experiments on benchmark datasets are presented, where the practical performance of the method is comparable to that of the SVM.
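Below is a minimal sketch of the procedure outlined above, under stated assumptions rather than the paper's exact estimators: it uses scikit-learn's KernelPCA for the subspace sequence and LinearSVC as an empirical hinge-risk minimizer, and the penalty constant `lam` and dataset are hypothetical placeholders.

```python
# Sketch: hinge-risk minimization over kernel-PCA subspaces of growing
# dimension, selecting the dimension that minimizes a penalized risk.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import KernelPCA
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
n = len(y)
lam = 0.05  # hypothetical penalty constant, not taken from the paper

# Project once onto the leading kernel principal components; truncating the
# columns then yields the increasing sequence of finite-dimensional spaces.
kpca = KernelPCA(n_components=30, kernel="rbf").fit(X)
Z = kpca.transform(X)

best_d, best_score = None, np.inf
for d in range(1, Z.shape[1] + 1):
    # Empirical hinge-risk minimizer on the d-dimensional projection.
    clf = LinearSVC(loss="hinge", C=1.0, max_iter=10000).fit(Z[:, :d], y)
    margins = (2 * y - 1) * clf.decision_function(Z[:, :d])
    hinge_risk = np.mean(np.maximum(0.0, 1.0 - margins))
    penalized = hinge_risk + lam * d / n  # penalty proportional to dimension
    if penalized < best_score:
        best_d, best_score = d, penalized

print(f"selected dimension d = {best_d}, penalized risk = {best_score:.3f}")
```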