timedelta

A timedelta is a duration representing the difference between two dates or times. In programming contexts, it denotes a length of time rather than a specific moment. Timedeltas are commonly used to compute future or past datetimes, or to measure the length of an interval.

In Python, timedelta is a class in the datetime module. It can be created by specifying days, seconds, and microseconds, as well as larger units such as weeks, hours, minutes, and milliseconds. All parameters are optional and default to zero. Example: timedelta(days=5, hours=3, minutes=20). Internally, a timedelta stores its value as days, seconds, and microseconds.

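To make the constructor and the internal storage described above concrete, here is a minimal sketch using only the standard library; the example values are arbitrary:

```python
from datetime import timedelta

# All constructor arguments are optional and default to zero.
d = timedelta(days=5, hours=3, minutes=20)

# Larger units are folded into the three stored components:
# days, seconds, and microseconds.
print(d.days, d.seconds, d.microseconds)      # 5 12000 0

# weeks and milliseconds are accepted and normalized the same way.
print(timedelta(weeks=1, milliseconds=1500))  # 7 days, 0:00:01.500000
```
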
Timedeltas support arithmetic operations. They can be added to or subtracted from other timedeltas or from datetime objects. Subtracting two datetimes yields a timedelta; adding a timedelta to a datetime yields a new datetime. Timedeltas can be scaled by multiplying or dividing by a number, effectively stretching or shrinking the duration. They can be compared to other timedeltas, and they may be negative if the interval is inverted. The total_seconds() method returns the entire duration expressed as a float number of seconds, including any fractional part from microseconds.

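A short sketch of the arithmetic described above; the dates are illustrative only:

```python
from datetime import datetime, timedelta

start = datetime(2024, 1, 1, 9, 0)
end = datetime(2024, 1, 1, 17, 30)

workday = end - start                  # subtracting datetimes -> timedelta (8:30:00)
deadline = start + timedelta(days=14)  # datetime + timedelta -> new datetime (2024-01-15 09:00)

half = workday / 2                     # scaling by a number -> 4:15:00
print(half < workday)                  # comparisons between timedeltas: True
print(start - end)                     # inverted interval is negative: -1 day, 15:30:00

print(workday.total_seconds())         # whole duration as a float: 30600.0
```
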
Key attributes include days, seconds, and microseconds, which break the duration down into its stored components.

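Since only days, seconds, and microseconds are stored (there are no hours or minutes attributes), a common pattern, sketched here, is to derive the remaining units with divmod:

```python
from datetime import timedelta

d = timedelta(days=2, hours=5, minutes=7, seconds=30)
print(d.days, d.seconds, d.microseconds)            # 2 18450 0

# Derive hours/minutes/seconds from the seconds component.
hours, rem = divmod(d.seconds, 3600)
minutes, seconds = divmod(rem, 60)
print(f"{d.days}d {hours}h {minutes}m {seconds}s")  # 2d 5h 7m 30s
```
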
Use cases include scheduling future moments, measuring elapsed time for operations, and computing offsets from a given timestamp. When dealing with time zones, ensure consistent use of naive versus aware datetimes: mixing them can raise errors or lead to incorrect results, even though the timedelta itself represents only a duration. Timedeltas are a core tool for duration arithmetic in many languages, with Python's datetime module providing a widely used implementation.

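A brief sketch of these use cases; the 24-hour expiry window and the sleep standing in for real work are assumptions made for illustration:

```python
from datetime import datetime, timedelta, timezone
import time

# Measuring elapsed time for an operation.
t0 = datetime.now(timezone.utc)
time.sleep(0.1)                                   # stand-in for real work
elapsed = datetime.now(timezone.utc) - t0
print(f"took {elapsed.total_seconds():.3f}s")

# Computing an offset from a given timestamp.
issued = datetime(2024, 6, 1, 12, 0, tzinfo=timezone.utc)
expires = issued + timedelta(hours=24)
print(expires)                                    # 2024-06-02 12:00:00+00:00

# Mixing naive and aware datetimes in arithmetic fails loudly.
try:
    datetime.now() - issued                       # naive minus aware
except TypeError as exc:
    print("cannot mix naive and aware:", exc)
```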