Non-Convex Potential Function Boosting Versus Noise Peeling: A Comparative Study
Independent thesis, Advanced level (degree of Master (One Year)), 20 credits / 30 HE credits, Student thesis
In recent decades, boosting methods have emerged as one of the leading ensemble learning techniques. Among the most popular boosting algorithms is AdaBoost, a highly influential algorithm noted for its excellent performance on many tasks. One of the most explored weaknesses of AdaBoost and many other boosting algorithms is their tendency to overfit to label noise, and consequently several alternative algorithms with greater robustness have been proposed. Among boosting algorithms that aim to accommodate noisy instances, RobustBoost, which optimizes a non-convex potential function, has gained popularity following a recent result stating that all convex potential boosters can be misled by random noise. Contrasting this approach, Martinez and Gray (2016) propose a simple but reportedly effective remedy for the noise sensitivity of the traditional AdaBoost algorithm by introducing peeling strategies into boosting. This thesis evaluates the robustness of these two alternatives on empirical and synthetic data sets in the case of binary classification. The results indicate that both methods enhance robustness compared to traditional convex potential function boosting algorithms, but not to a significant extent.
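To make the two ideas concrete, the following is a minimal sketch, not the thesis's actual implementation: a bare-bones AdaBoost over decision stumps in NumPy, plus a simple peeling step in the spirit of Martinez and Gray's approach, where the instances AdaBoost weights most heavily (suspected label noise) are dropped before refitting. All function names and the peel_frac parameter are illustrative assumptions.

```python
import numpy as np

def stump_fit(X, y, w):
    # Exhaustively search one-feature threshold splits for the
    # lowest weighted error. Returns (feature, threshold, polarity, error).
    best = (0, 0.0, 1, np.inf)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, t, pol, err)
    return best

def stump_predict(X, stump):
    j, t, pol, _ = stump
    return np.where(pol * (X[:, j] - t) >= 0, 1, -1)

def adaboost(X, y, rounds=10):
    # Standard AdaBoost: misclassified instances are upweighted each
    # round, which is exactly why label noise accumulates weight.
    n = len(y)
    w = np.full(n, 1.0 / n)
    model = []
    for _ in range(rounds):
        stump = stump_fit(X, y, w)
        err = max(stump[3], 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(X, stump)
        w *= np.exp(-alpha * y * pred)  # upweight mistakes
        w /= w.sum()
        model.append((alpha, stump))
    return model, w

def predict(model, X):
    score = sum(a * stump_predict(X, s) for a, s in model)
    return np.sign(score)

def peel_and_refit(X, y, rounds=10, peel_frac=0.1):
    # Peeling heuristic (illustrative): fit once, drop the fraction of
    # instances with the largest final weights, then refit on the rest.
    _, w = adaboost(X, y, rounds)
    keep = np.argsort(w)[: int(len(y) * (1 - peel_frac))]
    return adaboost(X[keep], y[keep], rounds)[0]
```

On a one-dimensional toy problem with a single flipped label, the noisy point ends up with the largest AdaBoost weight, so peeling removes it and the refit recovers the clean decision boundary.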
Place, publisher, year, edition, pages
2016, 21 p.
Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:uu:diva-302289
OAI: oai:DiVA.org:uu-302289
DiVA: diva2:956975
Subject / course
Master Programme in Statistics