Conference paper, 2020

ESL: Entropy-guided Self-supervised Learning for Domain Adaptation in Semantic Segmentation

Abstract

While fully-supervised deep learning yields good models for urban scene semantic segmentation, these models struggle to generalize to new environments with, for instance, different lighting or weather conditions. In addition, producing the extensive pixel-level annotations that the task requires comes at a great cost. Unsupervised domain adaptation (UDA) is one approach that tries to address these issues in order to make such systems more scalable. In particular, self-supervised learning (SSL) has recently become an effective strategy for UDA in semantic segmentation. At the core of such methods lies 'pseudo-labeling', that is, the practice of selecting high-confidence class predictions as pseudo-labels, which are subsequently used as true labels for target data. To collect pseudo-labels, previous works often rely on the highest softmax score, which we here argue is an unfavorable confidence measure. In this work, we propose Entropy-guided Self-supervised Learning (ESL), leveraging entropy as the confidence indicator for producing more accurate pseudo-labels. On different UDA benchmarks, ESL consistently outperforms strong SSL baselines and achieves state-of-the-art results.
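To make the selection criterion concrete, below is a minimal PyTorch sketch of entropy-guided pseudo-label selection, contrasted in a comment with the max-softmax baseline. The function name, the keep_ratio parameter, and the ignore index 255 are illustrative assumptions, not details taken from the paper.

    import torch
    import torch.nn.functional as F

    def entropy_pseudo_labels(logits, keep_ratio=0.5, ignore_index=255):
        # logits: (B, C, H, W) target-domain predictions from the segmentation net
        probs = F.softmax(logits, dim=1)
        # Per-pixel prediction entropy; low entropy means a confident prediction.
        entropy = -(probs * torch.log(probs + 1e-8)).sum(dim=1)   # (B, H, W)
        labels = probs.argmax(dim=1)                              # hard class predictions
        # Keep the keep_ratio fraction of lowest-entropy (most confident) pixels;
        # a max-softmax baseline would threshold on probs.max(dim=1).values instead.
        threshold = torch.quantile(entropy.flatten(), keep_ratio)
        labels[entropy > threshold] = ignore_index                # mask uncertain pixels
        return labels

The resulting label map can then serve as supervision for target images in a standard cross-entropy loss that skips the ignored pixels.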

Dates and versions

hal-03482770 , version 1 (16-12-2021)

Identifiers

Cite

Antoine Saporta, Tuan-Hung Vu, Matthieu Cord, Patrick Pérez. ESL: Entropy-guided Self-supervised Learning for Domain Adaptation in Semantic Segmentation. Workshop on Scalability in Autonomous Driving at IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jun 2020, Seattle, Washington (virtual), United States. ⟨hal-03482770⟩