Feature selection for nonlinear models with extreme learning machines

Benoît Frénay, Mark van Heeswijk, Yoan Miche, Michel Verleysen, Amaury Lendasse

Research output: Contribution to journal › Article › peer-review

Abstract

In the context of feature selection, there is a trade-off between the number of selected features and the generalisation error. Two plots may help to summarise feature selection: the feature selection path and the sparsity-error trade-off curve. The feature selection path shows the best feature subset for each subset size, whereas the sparsity-error trade-off curve shows the corresponding generalisation errors. These graphical tools may help experts to choose suitable feature subsets and extract useful domain knowledge. In order to obtain these tools, extreme learning machines are used here, since they are fast to train and an estimate of their generalisation error can easily be obtained using the PRESS statistic. An algorithm is introduced which adds an additional layer to standard extreme learning machines in order to optimise the subset of selected features. Experimental results illustrate the quality of the presented method.
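For illustration, below is a minimal sketch (not the authors' implementation) of the core mechanism the abstract relies on: an extreme learning machine whose leave-one-out error is obtained in closed form from the PRESS statistic, using the diagonal of the hat matrix rather than retraining on each left-out sample. The function name `elm_press` and all parameter choices are hypothetical.

```python
import numpy as np

def elm_press(X, y, n_hidden=50, seed=0):
    """Train an ELM and return its PRESS leave-one-out MSE estimate."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Random input weights and biases: the hidden layer is never trained.
    W = rng.standard_normal((d, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)  # hidden-layer activations
    # Output weights via least squares (the only trained parameters).
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    residuals = y - H @ beta
    # Diagonal of the hat matrix H (H^T H)^{-1} H^T, via the pseudo-inverse.
    hat_diag = np.einsum('ij,ji->i', H, np.linalg.pinv(H))
    # PRESS: leave-one-out residuals in closed form, no retraining needed.
    loo_residuals = residuals / (1.0 - hat_diag)
    return np.mean(loo_residuals ** 2)

# Tiny usage example on synthetic data (hypothetical).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
print("PRESS LOO-MSE:", elm_press(X, y, n_hidden=30))
```

In the setting described by the abstract, such a cheap leave-one-out estimate would serve to score candidate feature subsets, making it feasible to trace out the feature selection path and the sparsity-error trade-off curve.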
Original language: English
Pages (from-to): 111-124
Number of pages: 14
Journal: Neurocomputing
Volume: 102
DOIs
Publication status: Published - 2013
Externally published: Yes

Keywords

  • Extreme learning machine
  • Regression
  • Regularisation
  • Feature selection
  • ICTEAM:MLAI
