Soft-max boosting

Author / Contributors:
[Matthieu Geist]
Place, Publisher, Year:
2015
Contained in:
Machine Learning, 100/2-3(2015-09-01), 305-332
Format:
Article (online)
ID: 605478252
LEADER caa a22 4500
001 605478252
003 CHVBK
005 20210128100404.0
007 cr unu---uuuuu
008 210128e20150901xx s 000 0 eng
024 7 0 |a 10.1007/s10994-015-5491-2  |2 doi 
035 |a (NATIONALLICENCE)springer-10.1007/s10994-015-5491-2 
100 1 |a Geist  |D Matthieu  |u IMS - MaLIS Research Group and UMI 2958 (GeorgiaTech-CNRS), CentraleSupélec, Metz, France  |4 aut 
245 1 0 |a Soft-max boosting  |h [Electronic data]  |c [Matthieu Geist] 
520 3 |a The standard multi-class classification risk, based on the binary loss, is rarely minimized directly. This is due to (1) the lack of convexity and (2) the lack of smoothness (and even continuity). The classic approach consists in minimizing a convex surrogate instead. In this paper, we propose to replace the usually considered deterministic decision rule by a stochastic one, which allows obtaining a smooth risk (generalizing the expected binary loss, and more generally the cost-sensitive loss). Practically, this (empirical) risk is minimized by performing a gradient descent in the function space linearly spanned by a base learner (a.k.a. boosting). We provide a convergence analysis of the resulting algorithm and evaluate it on a range of synthetic and real-world data sets (with noiseless and noisy domains, compared to convex and non-convex boosters). 
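The abstract above describes the core idea: a soft-max stochastic decision rule turns the non-smooth binary-loss risk into a smooth one, which can then be minimized by gradient descent in function space (boosting). The following is a minimal illustrative sketch of that idea, not the paper's actual algorithm: it uses assumed toy Gaussian-blob data, a least-squares linear base learner, and a fixed step size, none of which come from the record.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-class data: three Gaussian blobs (illustrative assumption,
# not from the paper).
K, n = 3, 150
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(n // K, 2))
               for c in [(0, 0), (3, 0), (0, 3)]])
y = np.repeat(np.arange(K), n // K)

def softmax(F):
    # Stochastic decision rule: class k is chosen with probability pi_k(x).
    Z = np.exp(F - F.max(axis=1, keepdims=True))
    return Z / Z.sum(axis=1, keepdims=True)

def smooth_risk(F, y):
    # Expected binary loss under the stochastic rule:
    # P(error on x_i) = 1 - pi_{y_i}(x_i), averaged over the sample.
    return 1.0 - softmax(F)[np.arange(len(y)), y].mean()

# Boosting loop: gradient descent in the function space spanned by a
# base learner (here, least-squares linear regressors, one per class).
Xb = np.hstack([X, np.ones((len(X), 1))])   # add an intercept feature
F = np.zeros((len(X), K))                   # class score functions
eta = 1.0                                   # fixed step size (assumption)
risks = [smooth_risk(F, y)]
for _ in range(50):
    pi = softmax(F)
    p_true = pi[np.arange(len(y)), y][:, None]
    onehot = np.eye(K)[y]
    # Negative functional gradient of the smooth risk w.r.t. F:
    # d(-pi_{y_i})/dF_{ik} = -pi_{y_i} (1{k = y_i} - pi_{ik}).
    g = p_true * (onehot - pi)
    # Fit the base learner to the negative gradient, class by class.
    W, *_ = np.linalg.lstsq(Xb, g, rcond=None)
    F += eta * (Xb @ W)
    risks.append(smooth_risk(F, y))

accuracy = (F.argmax(axis=1) == y).mean()
```

On well-separated blobs the smooth risk decreases monotonically in early iterations and the induced deterministic classifier (argmax of the scores) reaches high accuracy; the paper itself uses general base learners and gives a convergence analysis, which this sketch does not attempt.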
540 |a The Author(s), 2015 
690 7 |a Multi-class classification  |2 nationallicence 
690 7 |a Boosting  |2 nationallicence 
690 7 |a Binary loss  |2 nationallicence 
690 7 |a Noise-tolerant learning  |2 nationallicence 
773 0 |t Machine Learning  |d Springer US; http://www.springer-ny.com  |g 100/2-3(2015-09-01), 305-332  |x 0885-6125  |q 100:2-3<305  |1 2015  |2 100  |o 10994 
856 4 0 |u https://doi.org/10.1007/s10994-015-5491-2  |q text/html  |z Online access via DOI 
898 |a BK010053  |b XK010053  |c XK010000 
900 7 |a Metadata rights reserved  |b Springer special CC-BY-NC licence  |2 nationallicence 
908 |D 1  |a research-article  |2 jats 
949 |B NATIONALLICENCE  |F NATIONALLICENCE  |b NL-springer 
950 |B NATIONALLICENCE  |P 856  |E 40  |u https://doi.org/10.1007/s10994-015-5491-2  |q text/html  |z Online access via DOI 
950 |B NATIONALLICENCE  |P 100  |E 1-  |a Geist  |D Matthieu  |u IMS - MaLIS Research Group and UMI 2958 (GeorgiaTech-CNRS), CentraleSupélec, Metz, France  |4 aut 
950 |B NATIONALLICENCE  |P 773  |E 0-  |t Machine Learning  |d Springer US; http://www.springer-ny.com  |g 100/2-3(2015-09-01), 305-332  |x 0885-6125  |q 100:2-3<305  |1 2015  |2 100  |o 10994