%A P. S. Bradley and O. L. Mangasarian
%T Feature Selection via Concave Minimization and Support Vector Machines
%D February 1998
%R 98-03
%I Computer Sciences Department, University of Wisconsin
%C Madison, WI
%X
A computational comparison is made between two feature selection approaches for
finding a separating plane that discriminates between two point sets
in an $n$-dimensional feature space while selecting
as few of the $n$ features
(dimensions) as possible. In the concave minimization approach
\cite{man:95d,bms:95}, a separating plane is generated by minimizing
a weighted sum of distances of
misclassified points to two parallel planes that bound the sets and
determine the separating plane midway between them.
In addition, the number of dimensions of the space used to determine
the plane is minimized.
In the support vector machine approach \cite{vap:95,bb:97,fg:97,wahba:97},
in addition to minimizing the weighted sum of distances of
misclassified points to the bounding planes, we also {\em maximize}
the distance between the two bounding planes that generate the
separating plane.
Computational results show that feature suppression is an indirect
consequence of the support vector machine approach when an appropriate
norm is used.
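The two approaches can be sketched in a common form. The following notation is assumed here, not given in the abstract: $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{k \times n}$ hold the two point sets row-wise, the bounding planes are $x^{T}w = \gamma \pm 1$, $e$ denotes a vector of ones, $y$ and $z$ are slack vectors measuring bounding-plane violations, and $\lambda \in [0,1)$ trades off suppression against separation error:
\[
\min_{w,\gamma,y,z}\;(1-\lambda)\left(\frac{e^{T}y}{m}+\frac{e^{T}z}{k}\right)
+\lambda\,\|w\|
\quad\text{subject to}\quad
-Aw+e\gamma+e\le y,\;\; Bw-e\gamma+e\le z,\;\; y\ge 0,\; z\ge 0.
\]
Under this reading, the concave minimization approach penalizes the number of nonzero components of $w$ (via a concave approximation), whereas the support vector machine penalizes a norm $\|w\|$, which maximizes the distance between the two bounding planes measured in the dual norm. This is a sketch consistent with the abstract's description, not the paper's exact statement.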
Numerical tests on six public data sets show that classifiers trained by
the concave minimization approach and those trained by a support vector
machine have comparable 10-fold cross-validation correctness.
However, on every data set tested, the
classifiers obtained by the concave minimization approach selected
fewer problem features than those trained by a support vector
machine.