Publisher(s): HAL CCSD; IEEE
Abstract: International audience
Recent works show that large-scale image classification problems rule out computationally demanding methods. On such problems, simple approaches like k-NN are affordable contenders, with still room for statistical improvements under the algorithmic constraints. A recent work showed how to leverage k-NN to yield a formal boosting algorithm. This method, however, has numerical issues that make it unsuited for large-scale problems. We propose here an Adaptive Newton-Raphson scheme to leverage k-NN, N³, which does not suffer from these issues. We show that it is a boosting algorithm with several key algorithmic and statistical properties. In particular, it may be sufficient to boost a subsample to reach the desired bounds for the loss at hand in the boosting framework. Experiments are provided on the SUN and Caltech databases. They confirm that boosting a subsample -- sometimes containing only a few examples -- is sufficient to reach the convergence regime of N³. Under such conditions, N³ challenges the accuracy of contenders with lower computational cost and lower memory requirements.
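The abstract describes leveraging k-NN so that per-example coefficients are learned in a boosting loop. The sketch below is a minimal illustration of that general idea only: it is NOT the paper's N³ algorithm. It assigns each training example a leveraging coefficient `alpha` and fits the coefficients by plain gradient steps on an exponential surrogate loss (a simple stand-in for the adaptive Newton-Raphson updates the abstract refers to); all function names and parameters are assumptions for illustration.

```python
# Illustrative sketch of a leveraged k-NN classifier (NOT the paper's N^3
# method): each training example j carries a leveraging coefficient alpha[j],
# and a query is classified by the alpha-weighted vote of its k nearest
# training examples. Coefficients are fit by gradient descent on an
# exponential surrogate loss, a simple stand-in for the adaptive
# Newton-Raphson scheme described in the abstract.
import numpy as np

def knn_indices(X, x, k):
    """Indices of the k nearest training points to query x (Euclidean)."""
    d = np.linalg.norm(X - x, axis=1)
    return np.argsort(d)[:k]

def fit_leveraged_knn(X, y, k=3, n_rounds=10, lr=0.5):
    """Learn per-example leveraging coefficients alpha (y in {-1, +1})."""
    n = len(X)
    alpha = np.zeros(n)
    neigh = [knn_indices(X, X[i], k) for i in range(n)]  # precomputed graph
    for _ in range(n_rounds):
        for j in range(n):
            # Gradient of sum_i exp(-y_i f_i) w.r.t. alpha[j], where
            # f_i = sum over neighbours m of i of alpha[m] * y[m].
            grad = 0.0
            for i in range(n):
                if j in neigh[i]:
                    f_i = np.sum(alpha[neigh[i]] * y[neigh[i]])
                    grad += -y[i] * y[j] * np.exp(-y[i] * f_i)
            alpha[j] -= lr * grad / n  # plain gradient step
    return alpha

def predict(X, y, alpha, x, k=3):
    """Sign of the alpha-weighted vote of the k nearest neighbours of x."""
    idx = knn_indices(X, x, k)
    return 1 if np.sum(alpha[idx] * y[idx]) >= 0 else -1
```

Boosting a subsample, as the abstract advocates, would amount to fitting `alpha` on a small subset of the training data while still predicting with the weighted neighbour vote.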
IEEE International Workshop on Machine Learning for Signal Processing
Southampton, United Kingdom
hal-00959125
https://hal.archives-ouvertes.fr/hal-00959125
https://hal.archives-ouvertes.fr/hal-00959125/document
https://hal.archives-ouvertes.fr/hal-00959125/file/BNNB_mlsp2013.pdf