
Behaviour of Sample Selection Techniques Under Explicit Regularization

EasyChair Preprint 6763

11 pages
Date: October 3, 2021

Abstract

A multitude of sample selection-based learning strategies have been developed for learning with noisy labels. However, it has also been suggested in the literature that early stopping may yield better performance than fully training the model. This leads us to ask how sample selection strategies behave under explicit regularization. To this end, we consider four of the most fundamental sample selection-based models: MentorNet, Co-teaching, Co-teaching+, and JoCoR. We provide empirical results of applying explicit L2 regularization to these approaches, and we compare them against a baseline: a vanilla CNN trained with regularization alone. We show that under explicit regularization, the preconceived ranking of the approaches can change. We also show several instances where the baseline outperforms some or all of the existing approaches. Moreover, we show that explicit regularization can narrow the performance gap between the approaches.
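To make the setting concrete, the sketch below shows a Co-teaching-style small-loss selection step with explicit L2 regularization applied through the optimizer's weight decay, which is the standard way to add an L2 penalty in PyTorch. This is a minimal illustration of the general technique, not the paper's exact configuration: the toy network, forget rate, and hyperparameter values are assumptions chosen for readability.

# Minimal sketch: Co-teaching-style small-loss selection with explicit L2
# regularization (weight decay). The toy CNN and all hyperparameters are
# illustrative assumptions, not the paper's exact setup.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.fc = nn.Linear(16 * 32 * 32, num_classes)

    def forward(self, x):
        x = F.relu(self.conv(x))
        return self.fc(x.flatten(1))

net1, net2 = ToyCNN(), ToyCNN()
# Explicit L2 regularization enters through the weight_decay parameter.
opt1 = torch.optim.SGD(net1.parameters(), lr=0.01, weight_decay=1e-4)
opt2 = torch.optim.SGD(net2.parameters(), lr=0.01, weight_decay=1e-4)

def coteaching_step(x, y, forget_rate: float = 0.2):
    """One update: each network selects small-loss samples for its peer."""
    num_keep = int((1.0 - forget_rate) * len(y))

    # Per-sample losses; small-loss samples are treated as likely clean.
    loss1 = F.cross_entropy(net1(x), y, reduction="none")
    loss2 = F.cross_entropy(net2(x), y, reduction="none")
    idx1 = torch.argsort(loss1)[:num_keep]  # samples net1 trusts
    idx2 = torch.argsort(loss2)[:num_keep]  # samples net2 trusts

    # Cross-update: net1 learns from net2's selection and vice versa.
    opt1.zero_grad()
    F.cross_entropy(net1(x[idx2]), y[idx2]).backward()
    opt1.step()

    opt2.zero_grad()
    F.cross_entropy(net2(x[idx1]), y[idx1]).backward()
    opt2.step()

The baseline in the comparison corresponds to dropping the selection step entirely and training a single such CNN with the same weight decay on all samples.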

Keyphrases: noisy labels, regularization, sample selection

BibTeX entry
BibTeX does not have a proper entry type for preprints; the following is a workaround that produces a correct reference:
@booklet{EasyChair:6763,
  author    = {Lakshya},
  title     = {Behaviour of Sample Selection Techniques Under Explicit Regularization},
  howpublished = {EasyChair Preprint 6763},
  year      = {EasyChair, 2021}}