Title: Calibrating Uncertainties in Object Localization Task
Publication Type: Conference Paper
Year of Publication: 2018
Authors: B. T. Phan, R. Salay, K. Czarnecki, V. Abdelzad, T. Denouden, and S. Vernekar
Conference Name: Third Workshop on Bayesian Deep Learning (NeurIPS 2018), Montréal, Canada
Date Published: 12/2018

In many safety-critical applications such as autonomous driving and surgical robots, it is desirable to obtain prediction uncertainties from object detection modules to help support safe decision-making. Specifically, such modules need to estimate the probability of each predicted object in a given region and the confidence interval for its bounding box. While recent Bayesian deep learning methods provide a principled way to estimate this uncertainty, the estimates for the bounding boxes obtained using these methods are uncalibrated. In this paper, we address this problem for the single-object localization task by adapting an existing technique for calibrating regression models. We show, experimentally, that the resulting calibrated model obtains more reliable uncertainty estimates.
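The calibration idea referenced above can be illustrated with a small sketch. The paper adapts an existing recalibration technique for regression models: evaluate each model's predicted CDF at the observed value, compare those predicted quantile levels against their empirical frequencies, and fit a monotone map between the two. The code below is an illustrative sketch of that diagnosis step on synthetic 1-D data, not the paper's implementation; the Gaussian predictive distribution, the toy data, and the function name `empirical_cdf_map` are assumptions for illustration.

```python
# Illustrative sketch of regression-uncertainty calibration on synthetic data.
# Assumes the model outputs a Gaussian N(mu, sigma^2) per prediction.
import numpy as np
from scipy.stats import norm

def empirical_cdf_map(mu, sigma, y):
    """Evaluate each predicted CDF at its observed value, then compute the
    empirical frequency with which observations fall at or below each
    predicted quantile level. A recalibration map (e.g. isotonic regression)
    would be fit to the pairs (p, emp)."""
    p = norm.cdf(y, loc=mu, scale=sigma)   # predicted quantile of each observation
    levels = np.sort(p)
    emp = np.searchsorted(levels, p, side="right") / len(p)  # empirical coverage
    return p, emp

# Toy overconfident predictor: claimed std is 0.5, but the true noise std is 1.0.
rng = np.random.default_rng(0)
n = 2000
mu = np.zeros(n)
sigma = np.full(n, 0.5)
y = rng.normal(0.0, 1.0, n)

p, emp = empirical_cdf_map(mu, sigma, y)

# Diagnostic: the nominal 90% central interval covers far fewer than 90% of
# the observations, which is the miscalibration the recalibration map fixes.
lo, hi = norm.ppf(0.05, mu, sigma), norm.ppf(0.95, mu, sigma)
coverage = np.mean((y >= lo) & (y <= hi))
```

For a calibrated model, the pairs `(p, emp)` lie on the diagonal and `coverage` is close to 0.90; here the overconfident predictor's 90% interval covers only about 60% of the data, and a monotone map fit to `(p, emp)` would correct the predicted quantiles.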
