Estimating mutual information for feature selection in the presence of label noise

Benoît Frénay, Gauthier Doquire, Michel Verleysen

Research output: Contribution to journal › Article › peer-review

Abstract

A way to achieve feature selection for classification problems polluted by label noise is proposed. The performance of traditional feature selection algorithms often decreases sharply when some samples are wrongly labelled. A method based on a probabilistic label noise model combined with a nearest neighbours-based entropy estimator is introduced to robustly evaluate the mutual information, a popular relevance criterion for feature selection. A backward greedy search procedure is used in combination with this criterion to find relevant sets of features. Experiments establish that (i) there is a real need to take possible label noise into account when selecting features and (ii) the proposed methodology effectively reduces the negative impact of the mislabelled data points on the feature selection process.
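The pipeline the abstract describes — mutual information estimated with a nearest-neighbours entropy estimator, driving a greedy backward search over feature subsets — can be sketched as follows. This is a minimal illustration using the standard Kozachenko-Leonenko entropy estimator, not the paper's noise-robust variant (the noise model is the paper's actual contribution and is omitted here); the function names and the toy data are assumptions made for the sketch.

```python
import math
import random

def digamma(x):
    """Digamma function via recurrence plus an asymptotic series (x > 0)."""
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - f * (1/12 - f * (1/120 - f / 252))

def knn_radii(points, k):
    """Euclidean distance from each point to its k-th nearest neighbour (brute force)."""
    radii = []
    for i, p in enumerate(points):
        dists = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        radii.append(dists[k - 1])
    return radii

def kl_entropy(points, k=3):
    """Kozachenko-Leonenko differential entropy estimate, in nats."""
    n, d = len(points), len(points[0])
    log_vd = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)  # log volume of the unit d-ball
    radii = knn_radii(points, k)
    return digamma(n) - digamma(k) + log_vd + d * sum(math.log(r) for r in radii) / n

def mutual_information(X, y, k=3):
    """I(X; Y) = H(X) - sum_c p(c) H(X | Y=c), for a discrete label Y."""
    n = len(X)
    h_cond = 0.0
    for c in set(y):
        Xc = [x for x, yi in zip(X, y) if yi == c]
        h_cond += len(Xc) / n * kl_entropy(Xc, k)
    return kl_entropy(X, k) - h_cond

def backward_select(X, y, target_size, k=3):
    """Greedy backward elimination: repeatedly drop the feature whose removal
    leaves the remaining subset with the highest estimated mutual information."""
    feats = list(range(len(X[0])))
    while len(feats) > target_size:
        scores = {f: mutual_information([[x[g] for g in feats if g != f] for x in X], y, k)
                  for f in feats}
        feats.remove(max(scores, key=scores.get))
    return feats

# Toy data: feature 0 separates the two classes, feature 1 is pure noise.
random.seed(0)
y = [i % 2 for i in range(200)]
X = [[random.gauss(3.0 if c else 0.0, 1.0), random.gauss(0.0, 1.0)] for c in y]
```

On this toy data the estimated mutual information of the informative feature is well above that of the noise feature, so `backward_select(X, y, 1)` should retain feature 0; a noise-robust estimator such as the one proposed in the paper would additionally keep this ranking stable when a fraction of the labels in `y` is flipped.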
Original language: English
Pages (from-to): 832-848
Number of pages: 17
Journal: Computational Statistics and Data Analysis
Volume: 71
DOIs
Publication status: Published - Mar 2014
Externally published: Yes

Keywords

  • Entropy estimation
  • Feature selection
  • Label noise
  • Mutual information
