Feature selection for nonlinear models with extreme learning machines

Benoît Frénay, Mark van Heeswijk, Yoan Miche, Michel Verleysen, Amaury Lendasse

Research output: Contribution to a journal › Article › Peer-reviewed

Abstract

In the context of feature selection, there is a trade-off between the number of selected features and the generalisation error. Two plots may help to summarise feature selection: the feature selection path and the sparsity-error trade-off curve. The feature selection path shows the best feature subset for each subset size, whereas the sparsity-error trade-off curve shows the corresponding generalisation errors. These graphical tools may help experts to choose suitable feature subsets and extract useful domain knowledge. In order to obtain these tools, extreme learning machines are used here, since they are fast to train and an estimate of their generalisation error can easily be obtained using the PRESS statistics. An algorithm is introduced, which adds an additional layer to standard extreme learning machines in order to optimise the subset of selected features. Experimental results illustrate the quality of the presented method.
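To make the two ideas in the abstract concrete, the sketch below combines a basic extreme learning machine, whose linear output layer is solved by least squares, with the PRESS (leave-one-out) error estimate, and uses it inside a greedy forward search to produce a feature selection path and the corresponding sparsity-error trade-off curve. This is a minimal illustrative sketch, not the paper's algorithm: the additional feature-optimisation layer described in the article is replaced here by plain forward selection, and the choices of tanh hidden units, pseudo-inverse solution, and the helper names `elm_press` and `forward_selection_path` are assumptions for the example.

```python
import numpy as np

def elm_press(X, y, n_hidden=50, seed=0):
    """Fit a basic ELM and return its PRESS (leave-one-out) mean squared error.

    Sketch only: random tanh hidden layer, output weights by least squares.
    """
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape

    # Random, untrained hidden layer (the defining trait of an ELM).
    W = rng.standard_normal((n_features, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)

    # Least-squares output weights via the pseudo-inverse.
    H_pinv = np.linalg.pinv(H)
    beta = H_pinv @ y
    residuals = y - H @ beta

    # PRESS: leave-one-out residuals of a linear least-squares fit,
    # e_i / (1 - h_ii), where h_ii is the diagonal of the hat matrix H @ H_pinv.
    hat_diag = np.einsum('ij,ji->i', H, H_pinv)
    loo_residuals = residuals / (1.0 - hat_diag)
    return np.mean(loo_residuals ** 2)

def forward_selection_path(X, y, **elm_kwargs):
    """Greedy forward search: at each subset size, add the feature whose
    inclusion gives the lowest PRESS error. Returns the feature selection
    path and the sparsity-error trade-off curve (one error per size)."""
    remaining = list(range(X.shape[1]))
    selected, path, errors = [], [], []
    while remaining:
        best_feature, best_error = None, np.inf
        for f in remaining:
            err = elm_press(X[:, selected + [f]], y, **elm_kwargs)
            if err < best_error:
                best_feature, best_error = f, err
        selected.append(best_feature)
        remaining.remove(best_feature)
        path.append(list(selected))   # feature selection path
        errors.append(best_error)     # sparsity-error trade-off curve
    return path, errors
```

Plotting `errors` against subset sizes 1..d gives the sparsity-error trade-off curve, and `path[k]` lists the subset retained at size k+1; note that greedy forward selection only approximates the best subset per size shown in the paper's plots.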
Original language: English
Pages (from-to): 111-124
Number of pages: 14
Journal: Neurocomputing
Volume: 102
DOIs:
Publication status: Published - 2013
Externally published: Yes
