Abstract: In machine learning, one of the problems with classification methods is that classifiers often produce overconfident probability estimates. A common remedy is calibration, which applies a correction to the predicted probabilities. This bachelor's thesis analyzes the Dirichlet calibration method. We studied how the calibration matrix changes over the course of classifier training, its effect on the results at different training stages, and interpreted the meaning of the calibration matrix elements. The thesis describes how calibration is performed with the Dirichlet calibration method and how the calibration matrix both reveals and improves the confidence of the classifier. The experiments were performed on deep neural networks with the ResNet110, Wide ResNet32, and DenseNet40 architectures on the CIFAR-10 dataset. The analysis showed that the classifiers were overconfident throughout the entire training process, and that the Dirichlet calibration method improves the confidence estimates at every stage of training.
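To make the method concrete, the following is a minimal sketch of how Dirichlet calibration transforms predicted probabilities, assuming the standard formulation of Kull et al. (2019), in which the calibrated probabilities are computed as softmax(W ln q + b) for a predicted probability vector q; the function name and parameter names here are illustrative, and W and b are assumed to have been fitted on a held-out validation set.

import numpy as np

def dirichlet_calibrate(probs, W, b):
    """Apply Dirichlet calibration to predicted probability vectors.

    probs: (n_samples, k) array of classifier output probabilities.
    W:     (k, k) calibration matrix, fitted on a validation set (assumed given).
    b:     (k,) intercept vector, fitted alongside W (assumed given).
    """
    eps = 1e-12  # guard against log(0) for hard 0/1 predictions
    log_probs = np.log(np.clip(probs, eps, 1.0))
    # Linear transformation of the log-probabilities by the calibration matrix.
    logits = log_probs @ W.T + b
    # Softmax maps the transformed values back to valid probabilities.
    shifted = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    exp = np.exp(shifted)
    return exp / exp.sum(axis=1, keepdims=True)

Inspecting the fitted W is what allows the kind of analysis the thesis performs: off-diagonal elements indicate systematic confusion between class pairs, while the diagonal reflects how strongly each class's own probability must be scaled to correct overconfidence.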