DIoU

DIoU, short for Distance-IoU, is a bounding-box regression loss used in object detection to improve localization accuracy. It extends the traditional IoU-based loss by incorporating the distance between the centers of the predicted and ground-truth boxes into the objective, encouraging faster convergence and more precise box alignment, including when the boxes barely overlap and plain IoU provides little useful gradient.

Mathematically, let Bp be the predicted axis-aligned box with center (xp, yp), and let the ground-truth box Bg have center (xg, yg). Let IoU denote the intersection-over-union between Bp and Bg. Let d^2 be the squared Euclidean distance between the centers, and let c^2 be the squared diagonal length of the smallest enclosing box that contains both Bp and Bg. Then the DIoU value is defined as DIoU = IoU − d^2 / c^2, and the corresponding loss used for training is L_DIoU = 1 − DIoU = 1 − IoU + d^2 / c^2. This formulation adds a center-distance penalty to the IoU objective, pushing predicted boxes toward the ground-truth centers.

DIoU is differentiable and can be implemented as a drop-in replacement for IoU-based regression losses in many detectors, including anchor-based and some anchor-free architectures. It is typically used alongside a classification loss in fully supervised object detection pipelines. Variants like CIoU (Complete IoU) extend DIoU by incorporating aspect-ratio information to further improve regression accuracy. DIoU is valued for improving convergence speed and localization precision, particularly when boxes overlap but are offset.
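
The formula above can be sketched in plain Python. This is a minimal illustration, not a reference implementation; the function name `diou_loss` and the (x1, y1, x2, y2) corner format for boxes are assumptions made for the example.

```python
def diou_loss(box_p, box_g):
    """DIoU loss for two axis-aligned boxes in (x1, y1, x2, y2) format.

    Computes L_DIoU = 1 - IoU + d^2 / c^2, where d is the distance
    between box centers and c is the diagonal of the smallest box
    enclosing both inputs. (Illustrative sketch, not a library API.)
    """
    px1, py1, px2, py2 = box_p
    gx1, gy1, gx2, gy2 = box_g

    # Intersection area (zero if the boxes do not overlap)
    ix1, iy1 = max(px1, gx1), max(py1, gy1)
    ix2, iy2 = min(px2, gx2), min(py2, gy2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    # Union area and IoU
    area_p = (px2 - px1) * (py2 - py1)
    area_g = (gx2 - gx1) * (gy2 - gy1)
    iou = inter / (area_p + area_g - inter)

    # d^2: squared Euclidean distance between box centers
    d2 = ((px1 + px2) / 2 - (gx1 + gx2) / 2) ** 2 \
       + ((py1 + py2) / 2 - (gy1 + gy2) / 2) ** 2

    # c^2: squared diagonal of the smallest enclosing box
    cx1, cy1 = min(px1, gx1), min(py1, gy1)
    cx2, cy2 = max(px2, gx2), max(py2, gy2)
    c2 = (cx2 - cx1) ** 2 + (cy2 - cy1) ** 2

    return 1.0 - iou + d2 / c2
```

For identical boxes both the IoU term and the center-distance penalty vanish, so the loss is 0; for two unit-offset 2x2 boxes such as (0, 0, 2, 2) and (1, 1, 3, 3), the overlap gives IoU = 1/7 and the center penalty adds d^2/c^2 = 2/18, so the loss is 1 − 1/7 + 1/9.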