Candidate: Sun Sheng Gu
Title: XC: Exploring Quantitative Use Cases for Explanations in 3D Object Detection
Date: December 22, 2021
Time: 13:00
Place: online
Supervisor(s): Czarnecki, Krzysztof
Abstract: Explainable AI (XAI) methods are frequently applied to obtain qualitative insights about deep models’ predictions. However, such insights need to be interpreted by a human observer to be useful. In this thesis, we aim to use explanations directly to make decisions without human observers. We adopt two gradient-based explanation methods, Integrated Gradients (IG) and backprop, for the task of 3D object detection. Then, we propose a set of quantitative measures, named Explanation Concentration (XC) scores, that can be used for downstream tasks. These scores quantify the concentration of attributions within the boundaries of detected objects.
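As a rough illustration of this idea (a sketch of an "explanation concentration"-style measure, not necessarily the exact XC definition used in the thesis), one could compute, for a single detection, the fraction of total attribution mass that falls inside the detected box:

```python
import numpy as np

def xc_score(attr, box_mask, eps=1e-12):
    """Fraction of total attribution mass inside the detected box.

    attr     : 2D array of attributions (e.g., over a BEV feature map).
    box_mask : boolean array of the same shape, True inside the box.

    Illustrative sketch only; the thesis's XC scores may differ in detail.
    """
    mass = np.abs(attr)
    return float(mass[box_mask].sum() / (mass.sum() + eps))

# Toy example: all attribution mass lies inside a 2x2 box.
attr = np.zeros((4, 4))
attr[1:3, 1:3] = 1.0
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
print(xc_score(attr, mask))  # close to 1.0: a well-concentrated explanation
```

A score near 1 means the attributions are concentrated on the detected object; attributions spread over background would drive the score toward 0.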
We evaluate the effectiveness of XC scores via the task of distinguishing true positive (TP) and false positive (FP) detected objects in the KITTI and Waymo datasets. The results demonstrate an improvement of more than 100% on both datasets compared to other heuristics, such as random guessing and the number of LiDAR points in the bounding box, raising confidence in XC’s potential for application in more use cases. Our results also indicate that computationally expensive XAI methods like IG may not be more valuable than simpler methods when used quantitatively. Moreover, we apply loss terms based on XC and the pixel attribution prior (PAP), another quantitative measure for attributions, to the task of training a 3D object detection model. We show that a performance boost is possible as long as we select the right subset of predictions to which the attribution-based losses are applied.
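A minimal sketch of what such attribution-based regularization could look like (hypothetical; `lam` and the exact penalty form are assumptions, not the thesis's actual loss): add a term that rewards high XC only for a selected subset of predictions.

```python
import numpy as np

def total_loss(det_loss, xc_scores, selected, lam=0.1):
    """Combine a detection loss with an XC-based penalty (sketch).

    det_loss : scalar detection loss (e.g., classification + box regression).
    xc_scores: per-prediction XC scores in [0, 1].
    selected : boolean mask choosing which predictions the attribution
               loss applies to (the thesis finds this selection matters).
    lam      : weight of the attribution term (hypothetical value).
    """
    xc = np.asarray(xc_scores, dtype=float)
    sel = np.asarray(selected, dtype=bool)
    # Penalize low concentration: (1 - XC) averaged over selected predictions.
    attr_loss = (1.0 - xc[sel]).mean() if sel.any() else 0.0
    return det_loss + lam * attr_loss

# Apply the attribution loss only to the first and third predictions.
print(total_loss(1.0, [0.9, 0.2, 0.8], [True, False, True]))
```

If no predictions are selected, the attribution term vanishes and training reduces to the plain detection loss.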
Wednesday, December 22, 2021, 1:00 pm EST (GMT -05:00)