Most of the existing salient object detection (SOD) models focus on improving overall model performance, without explicitly explaining the discrepancy between the training and testing distributions. In this paper, we investigate a particular type of epistemic uncertainty, namely distributional uncertainty, for salient object detection. Specifically, for the first time, we explore existing class-aware distribution gap exploration techniques, i.e., long-tail learning, single-model uncertainty modeling and test-time strategies, and adapt them to model the distributional uncertainty for our class-agnostic task. We define test samples that are dissimilar to the training dataset as "out-of-distribution" (OOD) samples. Different from the conventional OOD definition, where OOD samples are those not belonging to the closed-world training categories, OOD samples for SOD are those that break the basic saliency priors, i.e., the center prior, color contrast prior, compactness prior, etc., indicating that OOD is "continuous" instead of discrete for our task. We carry out extensive experiments to verify the effectiveness of existing distribution gap modeling techniques for SOD, and conclude that both train-time single-model uncertainty estimation techniques and weight-regularization solutions that prevent model activations from drifting too much are promising directions for modeling distributional uncertainty for SOD.
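As a hedged sketch of the activation-regularization direction mentioned above, the snippet below clips feature activations at a fixed threshold in the spirit of ReAct, so that test-time activations cannot drift far beyond the range seen during training. The toy model and the threshold value are illustrative assumptions, not the exact implementation of this repo.

```python
import torch
import torch.nn as nn

class ReActClip(nn.Module):
    """Rectify activations at a fixed threshold c (ReAct-style).
    In practice c is often set to a high percentile (e.g. 90th) of
    training activations; c=1.0 here is only a placeholder assumption."""
    def __init__(self, c: float = 1.0):
        super().__init__()
        self.c = c

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.clamp(x, max=self.c)

# Hypothetical usage: clip the features right before the saliency head.
backbone = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU())
head = nn.Conv2d(64, 1, 1)  # 1-channel saliency logit
model = nn.Sequential(backbone, ReActClip(c=1.0), head)

x = torch.randn(2, 3, 224, 224)
saliency_logits = model(x)  # shape (2, 1, 224, 224)
```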
Visualization of different types of uncertainty, including aleatoric uncertainty (inherent noise in the data), model uncertainty, and distributional uncertainty (the mismatch between training and testing distributions).
"OOD" samples for salient object detection. Different from class-aware tasks, OOD for saliency detection is continuous, and can be defined as attributes that break the basic saliency priors, i.e., the center prior, contrast prior, compactness prior, etc. We aim to explore distributional uncertainty estimation for saliency detection.
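For intuition, the center prior can be expressed as a Gaussian weighting that favors pixels near the image center; a test image whose salient object lies far from the center "breaks" this prior. The following is a minimal, assumed formulation (the bandwidth sigma is a free parameter), not the scoring used in the paper.

```python
import torch

def center_prior_map(h: int, w: int, sigma: float = 0.3) -> torch.Tensor:
    """Gaussian center-prior map: close to 1 at the image center,
    decaying towards the borders. sigma is relative to image size."""
    ys = torch.linspace(-1.0, 1.0, h).view(h, 1).expand(h, w)
    xs = torch.linspace(-1.0, 1.0, w).view(1, w).expand(h, w)
    return torch.exp(-(xs ** 2 + ys ** 2) / (2 * sigma ** 2))

# A saliency map concentrated far from the center receives a low score,
# i.e. it violates the center prior.
prior = center_prior_map(224, 224)
saliency = torch.rand(224, 224)  # stand-in for a predicted saliency map
score = (prior * saliency).sum() / saliency.sum().clamp(min=1e-8)
```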
PyTorch 1.10.0
Torchvision 0.11.1
CUDA 11.4
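A quick sanity check of the environment (the expected version strings are the ones listed above; note that `torch.version.cuda` reports the CUDA toolkit PyTorch was built against, which may differ from the driver version):

```python
import torch
import torchvision

print(torch.__version__)         # expect 1.10.0
print(torchvision.__version__)   # expect 0.11.1
print(torch.version.cuda)        # CUDA toolkit used to build PyTorch
print(torch.cuda.is_available()) # True if a GPU is visible
```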
We use the DUTS training dataset to train our models, and use the DUTS-test, ECSSD and DUT datasets for evaluation.
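A minimal sketch of how such image/ground-truth pairs could be loaded with PyTorch. The directory layout (`images/` and `gt/` subfolders) and the file extensions are assumptions for illustration, not necessarily this repository's structure.

```python
import os
from PIL import Image
from torch.utils.data import Dataset
from torchvision import transforms

class SODDataset(Dataset):
    """Loads (RGB image, binary saliency mask) pairs, e.g. from DUTS.
    Assumes <root>/images/*.jpg with matching <root>/gt/*.png masks."""
    def __init__(self, root: str, size: int = 352):
        self.img_dir = os.path.join(root, "images")
        self.gt_dir = os.path.join(root, "gt")
        self.names = sorted(os.path.splitext(f)[0] for f in os.listdir(self.img_dir))
        self.to_tensor = transforms.Compose([
            transforms.Resize((size, size)),
            transforms.ToTensor(),
        ])

    def __len__(self):
        return len(self.names)

    def __getitem__(self, i):
        img = Image.open(os.path.join(self.img_dir, self.names[i] + ".jpg")).convert("RGB")
        gt = Image.open(os.path.join(self.gt_dir, self.names[i] + ".png")).convert("L")
        return self.to_tensor(img), self.to_tensor(gt)
```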
We summarize many methods (MCDropout, DeepEnsemble, TALT, NorCal, WB, MCP, Energy, SML, GradNorm, ExGrad, TCP, DC, ReAct, CoTTA, and CTTA) and apply them to the SOD task to model distributional uncertainty. Thanks to the authors of these awesome and meaningful works.
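As one hedged example of the listed techniques, MC-Dropout estimates uncertainty by keeping dropout active at test time and measuring the variance across several stochastic forward passes. The toy network below stands in for an actual SOD model and is an assumption, not this repo's architecture.

```python
import torch
import torch.nn as nn

def mc_dropout_uncertainty(model: nn.Module, x: torch.Tensor, n_samples: int = 10):
    """Run n_samples stochastic forward passes with dropout enabled and
    return the mean saliency prediction and its per-pixel variance."""
    model.eval()
    for m in model.modules():  # keep only dropout layers stochastic
        if isinstance(m, (nn.Dropout, nn.Dropout2d)):
            m.train()
    with torch.no_grad():
        preds = torch.stack([torch.sigmoid(model(x)) for _ in range(n_samples)])
    return preds.mean(0), preds.var(0)

# Toy stand-in for a saliency network (illustrative assumption).
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.Dropout2d(0.3),
    nn.Conv2d(16, 1, 1),
)
mean_sal, uncertainty = mc_dropout_uncertainty(model, torch.randn(1, 3, 224, 224))
```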