On the influence of the kernel on the consistency of support vector machines.

*Zbl 1009.68143*

**Summary:** We study the generalization ability of several classifiers of Support Vector Machine (SVM) type using a class of kernels that we call *universal*. We show that the soft margin algorithms with universal kernels are consistent for a large class of classification problems, including certain noisy tasks, provided that the regularization parameter is chosen appropriately. In particular, we derive a simple sufficient condition on this parameter for Gaussian RBF kernels. On the one hand, our considerations are based on an approximation property of the kernels used, the so-called universality, which ensures that every continuous function can be approximated by kernel expressions. This approximation property also gives new insight into the role of kernels in these and other algorithms. On the other hand, the results are obtained through a precise study of the underlying optimization problems of the classifiers. Furthermore, we show consistency for the maximal margin classifier as well as for the soft margin SVMs in the presence of large margins; in this case it turns out that even constant regularization parameters ensure consistency for the soft margin SVMs. Finally, we prove that even for simple, noise-free classification problems, SVMs with polynomial kernels can behave arbitrarily badly.
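The contrast between universal and non-universal kernels can be made concrete with a small numerical sketch. The following NumPy snippet (an illustration, not taken from the paper; the target function, kernel widths, and grid sizes are assumptions chosen for demonstration) compares how well linear combinations of kernel sections \(k(x_i,\cdot)\) can uniformly approximate a continuous target. The Gaussian kernel, being universal, drives the uniform error down, while the linear kernel (a degree-1 polynomial kernel) spans only linear functions and cannot approximate \(f(x)=|x|\) at all:

```python
import numpy as np

def gauss_kernel(a, b, gamma=20.0):
    # Gaussian RBF kernel matrix K[i, j] = exp(-gamma * (a_i - b_j)^2)
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

def lin_kernel(a, b):
    # Linear kernel k(x, y) = x * y; its sections span only linear functions
    return a[:, None] * b[None, :]

centers = np.linspace(-1, 1, 30)    # kernel centers x_i (illustrative choice)
grid = np.linspace(-1, 1, 400)      # evaluation grid on X = [-1, 1]
target = np.abs(grid)               # continuous target f(x) = |x|

def uniform_error(kernel):
    # Least-squares fit of sum_i c_i * k(x_i, .) to the target on the grid,
    # then report the sup-norm error of the resulting approximation.
    Phi = kernel(grid, centers)     # 400 x 30 design matrix
    c, *_ = np.linalg.lstsq(Phi, target, rcond=None)
    return np.max(np.abs(Phi @ c - target))

gauss_err = uniform_error(gauss_kernel)  # small: universal kernel
lin_err = uniform_error(lin_kernel)      # stays large: span is too poor
print(gauss_err, lin_err)
```

The best linear fit to the even function \(|x|\) on a symmetric grid is the zero function, so the linear-kernel error stays near 1, whereas the Gaussian expansion approximates the target closely. This mirrors the paper's point that the approximation power of the kernel, not just the margin algorithm, governs what classification problems the SVM can handle.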

##### MSC:

68T10: Pattern recognition, speech recognition