Improving the Feature Selection Stability of the Delta Test in Regression

Research output: Contribution to journal › Article › peer-review

Abstract

Feature selection is an important preprocessing step that helps to improve model performance and to extract knowledge about important features in a dataset. However, feature selection methods are known to be adversely impacted by changes in the training dataset: even small differences between input datasets can result in the selection of different feature sets. This paper tackles this issue in the particular case of the delta test, a well-known feature relevance criterion that approximates the noise variance for regression tasks. A new feature selection criterion is proposed, the delta test bar, which is shown to be more stable than its close competitors.
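For readers unfamiliar with the criterion the paper builds on, the sketch below illustrates the standard delta test: the noise variance is approximated by half the mean squared difference between each output and the output of its nearest neighbour in input space, and a feature subset is judged better when this estimate is lower. The paper's proposed "delta test bar" is not reproduced here; the greedy selection loop and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def delta_test(X, y):
    """Standard delta test: half the mean squared output difference
    between each sample and its nearest neighbour in input space.
    Lower values suggest a more informative feature subset."""
    tree = cKDTree(X)
    # k=2 because the closest point returned is the sample itself
    _, idx = tree.query(X, k=2)
    nn = idx[:, 1]
    return 0.5 * np.mean((y[nn] - y) ** 2)

def greedy_delta_selection(X, y):
    """Hypothetical forward selection driven by the delta test:
    repeatedly add the feature that most reduces the estimate,
    stopping when no remaining feature improves it."""
    remaining = list(range(X.shape[1]))
    selected, best = [], np.inf
    improved = True
    while improved and remaining:
        improved = False
        for f in remaining:
            score = delta_test(X[:, selected + [f]], y)
            if score < best:
                best, best_f, improved = score, f, True
        if improved:
            selected.append(best_f)
            remaining.remove(best_f)
    return selected, best
```

Because the nearest-neighbour structure can change noticeably with small perturbations of the training set, a selection procedure like the one above can return different subsets on resampled data, which is the stability issue the paper addresses.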

Original language: English
Pages (from-to): 1-7
Number of pages: 7
Journal: IEEE Transactions on Artificial Intelligence
DOIs
Publication status: Published - 12 Sept 2023

Keywords

  • Artificial intelligence
  • Bars
  • Classification and regression
  • Feature extraction
  • interpretable machine learning
  • Numerical stability
  • Stability criteria
  • Standards
  • Training
  • trustworthy artificial intelligence
