Calibrating Uncertainties in Object Localization Task

Abstract

In many safety-critical applications such as autonomous driving and surgical robots, it is desirable to obtain prediction uncertainties from object detection modules to help support safe decision-making. Specifically, such modules need to estimate the probability that each predicted object appears in a given region, along with a confidence interval for its bounding box. While recent Bayesian deep learning methods provide a principled way to estimate this uncertainty, the bounding-box estimates obtained with these methods are uncalibrated. In this paper, we address this problem for the single-object localization task by adapting an existing technique for calibrating regression models. We show experimentally that the resulting calibrated model yields more reliable uncertainty estimates.
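
The abstract refers to adapting an existing technique for calibrating regression models. A minimal sketch of one such approach is given below: quantile recalibration via isotonic regression, applied independently to a single bounding-box coordinate with a Gaussian predictive distribution. The choice of recalibration method, the Gaussian assumption, and the function names (fit_recalibrator, calibrated_interval) are illustrative assumptions, not necessarily the paper's exact procedure.

# Sketch: recalibrating Gaussian bounding-box uncertainties for one coordinate
# via quantile recalibration with isotonic regression (illustrative only).
import numpy as np
from scipy.stats import norm
from sklearn.isotonic import IsotonicRegression

def fit_recalibrator(mu, sigma, y):
    # mu, sigma: per-example predictive mean / std for one box coordinate
    # y:         ground-truth coordinate values on a held-out calibration set
    # Predicted CDF evaluated at each observed target.
    p_pred = norm.cdf(y, loc=mu, scale=sigma)
    # Empirical frequency: fraction of examples whose CDF value is <= each p.
    p_emp = np.array([(p_pred <= p).mean() for p in p_pred])
    # Monotone map from predicted quantile levels to empirical coverage.
    iso = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
    iso.fit(p_pred, p_emp)
    return iso

def calibrated_interval(mu, sigma, iso, level=0.95):
    # Central interval for a single prediction, after recalibration.
    lo_q, hi_q = (1 - level) / 2, 1 - (1 - level) / 2
    # Invert the isotonic map numerically on a grid of raw quantile levels.
    grid = np.linspace(1e-4, 1 - 1e-4, 1000)
    cal = iso.predict(grid)
    idx_lo = min(np.searchsorted(cal, lo_q), len(grid) - 1)
    idx_hi = min(np.searchsorted(cal, hi_q), len(grid) - 1)
    return norm.ppf(grid[idx_lo], mu, sigma), norm.ppf(grid[idx_hi], mu, sigma)

The idea is that after fitting, a nominal 95% interval should cover the true coordinate roughly 95% of the time on held-out data, which is the sense of "calibrated" used in the abstract.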

Year of Publication
2018
Date Published
12/2018