Is mutual information adequate for feature selection in regression?

Benoît Frénay, Gauthier Doquire, Michel Verleysen

Research output: Contribution to journal › Article › Peer-reviewed

Abstract

Feature selection is an important preprocessing step for many high-dimensional regression problems. One of the most common strategies is to select a relevant feature subset based on the mutual information criterion. However, no connection has yet been established in the machine learning literature between the use of mutual information and a regression error criterion. This is an important gap, since minimising such a criterion is ultimately the objective one is interested in. This paper demonstrates that, under some reasonable assumptions, the features selected with the mutual information criterion are the ones minimising the mean squared error and the mean absolute error. Conversely, it is also shown that the mutual information criterion can fail to select optimal features in some situations, which we characterise. The theoretical developments presented in this work are expected to lead in practice to a critical and efficient use of mutual information for feature selection. © 2013 Elsevier Ltd.
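The strategy discussed in the abstract, ranking features by their mutual information with the regression target, can be illustrated with a minimal sketch. This is not code from the paper: it uses a simple histogram (plug-in) estimator of mutual information, whereas practical work often relies on more refined estimators (e.g. nearest-neighbour based). The data, bin count, and variable names are illustrative assumptions.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Histogram plug-in estimate of I(X; Y) in nats (illustrative only)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()              # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)    # marginal p(x), shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)    # marginal p(y), shape (1, bins)
    nz = pxy > 0                           # avoid log(0) on empty cells
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Toy regression data: y depends on x_rel, while x_noise is irrelevant.
rng = np.random.default_rng(0)
x_rel = rng.normal(size=2000)
x_noise = rng.normal(size=2000)
y = x_rel + 0.1 * rng.normal(size=2000)

mi_rel = mutual_information(x_rel, y)      # high: informative feature
mi_noise = mutual_information(x_noise, y)  # near zero: irrelevant feature
```

Ranking features by such scores and keeping the top ones is the mutual-information selection criterion whose link to the mean squared and mean absolute errors the paper analyses.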

Original language: English
Pages (from-to): 1-7
Number of pages: 7
Journal: Neural Networks
Volume: 48
DOIs
Publication status: Published - Dec. 2013
Externally published: Yes

