Parameter-insensitive kernel in extreme learning for non-linear support vector regression

Benoît Frénay, Michel Verleysen

Research output: Contribution to journal › Article

Abstract

Support vector regression (SVR) is a state-of-the-art method for regression which uses the ε-insensitive loss and produces sparse models. However, non-linear SVRs are difficult to tune because of the additional kernel parameter. In this paper, a new parameter-insensitive kernel inspired by extreme learning is used for non-linear SVR. Hence, the practitioner has only two meta-parameters to optimise. The proposed approach significantly reduces the computational complexity, yet experiments show that it yields performances very close to the state of the art. Unlike previous works, which rely on Monte-Carlo approximation to estimate the kernel, this work also shows that the proposed kernel has an analytic form which is computationally easier to evaluate. © 2011 Elsevier B.V.
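The Monte-Carlo approach mentioned in the abstract can be sketched as follows: draw a random hidden layer once, approximate the kernel as the average inner product of hidden-layer activations, and feed the resulting Gram matrix to a standard SVR. This is an illustrative sketch, not the paper's implementation; the activation function, weight distribution, and toy data below are assumptions, and only C and ε remain to be tuned once the kernel is fixed.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Toy 1-D regression data (hypothetical, for illustration only).
X_train = rng.uniform(-3, 3, size=(80, 1))
y_train = np.sin(X_train).ravel() + 0.1 * rng.standard_normal(80)
X_test = rng.uniform(-3, 3, size=(20, 1))

# Random hidden layer drawn once and shared by every kernel evaluation:
# the ELM-style kernel is approximated by averaging the product of
# hidden-layer activations over many randomly weighted neurons.
n_hidden = 2000
W = rng.standard_normal((1, n_hidden))  # random input weights (assumed Gaussian)
b = rng.standard_normal(n_hidden)       # random biases

def hidden(X):
    # Hidden-layer activations; tanh is an assumed choice of activation.
    return np.tanh(X @ W + b)

def elm_kernel(A, B):
    # Monte-Carlo estimate of the kernel: mean over hidden neurons.
    return hidden(A) @ hidden(B).T / n_hidden

# With the kernel parameter-free, only two meta-parameters remain: C and epsilon.
model = SVR(kernel="precomputed", C=10.0, epsilon=0.05)
model.fit(elm_kernel(X_train, X_train), y_train)

# Prediction uses the cross-kernel between test and training points.
pred = model.predict(elm_kernel(X_test, X_train))
print(pred.shape)
```

The analytic kernel derived in the paper would replace `elm_kernel` with a closed-form expression, avoiding the cost of the `n_hidden`-wide random layer; its exact form is given in the article itself.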

Original language: English
Pages (from-to): 2526-2531
Number of pages: 6
Journal: Neurocomputing
Volume: 74
Issue number: 16
DOIs
Publication status: Published - Sep 2011
Externally published: Yes

Keywords

  • ELM kernel
  • Extreme learning machine
  • Infinite number of neurons
  • Support vector regression

