Abstract: This research has been partially supported by the Spanish Ministerio de Ciencia e Innovación, Agencia Estatal de Investigación/FEDER grant PID2020-114594GBC21, Junta de Andalucía projects P18-FR-1422 and P18-FR-2369, projects FEDERUS-1256951, BFQM-322-UGR20, CEI-3-FQM331, and NetmeetData-Ayudas Fundación BBVA a equipos de investigación científica 2019. The first author was also partially supported by the IMAG-María de Maeztu grant CEX2020-001105-M/AEI/10.13039/501100011033.

In this paper we propose a novel methodology for constructing Optimal Classification Trees that accounts for the possibility of noisy labels in the training sample. The motivation for this methodology is the superadditive effect of combining margin-based classifiers with outlier detection techniques. Our approach rests on two main elements: (1) the splitting rules of the classification tree are designed to maximize the separation margin between classes, following the SVM paradigm; and (2) some labels of the training sample are allowed to change during the construction of the tree in order to detect label noise. Both features are integrated to design the resulting Optimal Classification Tree. We present a Mixed Integer Non-Linear Programming formulation for the problem, suitable for solution with any of the available off-the-shelf solvers. The model is analyzed and tested on a battery of standard datasets from the UCI Machine Learning Repository, showing the effectiveness of our approach. Our computational results show that in most cases the new methodology outperforms, both in accuracy and AUC, the benchmarks provided by OCT and OCT-H.
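For intuition only, the following is a minimal sketch of how a single margin-based split with relabeling could be modeled; it is not the paper's actual formulation. It combines a soft-margin SVM hyperplane (w, b) with binary variables z_i that allow the observed label y_i of point x_i to be flipped, penalized in the objective. The penalty weights c_1, c_2 and the variables z_i are assumptions introduced here for illustration.

```latex
% Hypothetical sketch of one margin-based split with label relabeling.
% (w, b): separating hyperplane; \xi_i: margin violations;
% z_i = 1 if label y_i is treated as noisy and flipped.
% c_1, c_2 are illustrative penalty parameters, not taken from the paper.
\begin{align*}
\min_{w,\, b,\, \xi,\, z}\quad & \tfrac{1}{2}\lVert w\rVert^{2}
      + c_1 \sum_{i=1}^{n} \xi_i
      + c_2 \sum_{i=1}^{n} z_i \\
\text{s.t.}\quad & y_i\,(1 - 2 z_i)\,\bigl(w^{\top} x_i + b\bigr) \ge 1 - \xi_i,
      \qquad i = 1,\dots,n, \\
 & \xi_i \ge 0,\quad z_i \in \{0,1\},
      \qquad\qquad\qquad\;\; i = 1,\dots,n.
\end{align*}
```

The product of the binary flip variables with the hyperplane coefficients makes the constraints nonlinear, which is consistent with the Mixed Integer Non-Linear nature of the formulation described in the abstract; in a full tree model one would presumably have one such set of split variables per branch node, with the relabeling penalty discouraging excessive flips.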