Latency

Latency (Dutch: latentie) is the delay between a cause and its observable effect in a system. It is typically measured in units of time, such as milliseconds, and is a key metric of responsiveness. Latency can be described as one-way or end-to-end; measuring the outbound and return paths together yields the round-trip time (RTT). Jitter describes the variation in latency over time.
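These terms can be made concrete with a small numeric sketch. The one-way samples below are hypothetical values; RTT is estimated by doubling the one-way figure (which assumes a symmetric path, often untrue in practice), and jitter is taken as the mean absolute difference between consecutive samples:

```python
# Hypothetical one-way latency samples in milliseconds (illustrative values only).
samples_ms = [20.1, 22.4, 19.8, 25.0, 21.2]

# Round-trip time estimate, assuming a symmetric path (often untrue in practice).
rtt_ms = [2 * s for s in samples_ms]

# Jitter as the mean absolute difference between consecutive samples.
jitter_ms = sum(abs(b - a) for a, b in zip(samples_ms, samples_ms[1:])) / (len(samples_ms) - 1)

print(f"mean one-way latency: {sum(samples_ms) / len(samples_ms):.1f} ms")
print(f"mean RTT estimate: {sum(rtt_ms) / len(rtt_ms):.1f} ms")
print(f"jitter: {jitter_ms:.1f} ms")
```

Real measurement tools report jitter in slightly different ways (for example, RTP uses a smoothed estimator), but the idea is the same: jitter summarizes how much latency varies, not how large it is.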

In telecommunications and computer networks, latency is composed of several components: propagation delay, transmission delay, processing delay, and queuing delay. Propagation delay depends on the physical distance and the speed of the signal through the medium. Transmission delay relates to packet size and link bandwidth. Processing delay arises from the time devices take to examine and forward packets. Queuing delay results from traffic congestion and buffering. The sum of these delays gives the end-to-end latency between sender and receiver.

In computer systems, latency encompasses memory latency (time to access data in memory), storage latency (disk or flash access), CPU and GPU processing delays, and input/output delays. Low latency is crucial for interactive applications such as gaming, trading platforms, and real-time analytics, while higher latency can degrade user experience and system performance.

Measurement and mitigation: common tools include ping for round-trip time and traceroute or MTR for route inspection. Reducing latency can involve upgrading network links, deploying content delivery networks, using edge computing and caching, optimizing software protocols, and reducing processing overhead. Techniques such as asynchronous I/O, parallel processing, and quality of service prioritization also help lower effective latency.

Applications and relevance: latency affects the perceived immediacy of responses in real-time communication, immersive media, and interactive services, shaping design choices across networks, computing infrastructure, and application software.
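The four network delay components lend themselves to a back-of-the-envelope calculation. In the sketch below, every figure (path length, signal speed, packet size, link bandwidth, and the per-hop processing and queuing costs) is an assumption chosen for illustration, not a measurement:

```python
# Back-of-the-envelope end-to-end latency from its four components.
# All figures below are assumptions for illustration, not measurements.

distance_m = 1_000_000          # 1,000 km path
signal_speed = 2e8              # roughly 2/3 the speed of light in optical fiber, m/s
packet_bits = 1500 * 8          # one 1500-byte packet
bandwidth_bps = 100e6           # 100 Mbit/s link

propagation_s = distance_m / signal_speed     # distance / signal speed
transmission_s = packet_bits / bandwidth_bps  # packet size / link bandwidth
processing_s = 50e-6                          # assumed forwarding cost
queuing_s = 200e-6                            # assumed buffering under load

end_to_end_s = propagation_s + transmission_s + processing_s + queuing_s
print(f"end-to-end latency: {end_to_end_s * 1000:.3f} ms")  # 5.370 ms
```

At this distance propagation dominates (5 ms of the 5.37 ms total), which is why long-haul latency cannot be bought away with more bandwidth; over short links, by contrast, queuing and processing usually matter most.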
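Among the mitigation techniques listed above, asynchronous I/O lowers effective latency by overlapping independent waits instead of serializing them. A minimal sketch using Python's asyncio, where two simulated 50 ms I/O operations stand in for real network calls:

```python
import asyncio
import time

# A simulated I/O wait (stand-in for a real network or disk operation).
async def fake_io(delay_s: float) -> None:
    await asyncio.sleep(delay_s)

async def main() -> float:
    start = time.perf_counter()
    # Overlap two independent 50 ms waits: total time tracks the slowest
    # operation rather than the sum of both.
    await asyncio.gather(fake_io(0.05), fake_io(0.05))
    return time.perf_counter() - start

elapsed = asyncio.run(main())
print(f"elapsed: {elapsed * 1000:.0f} ms (sequential would be ~100 ms)")
```

The same principle applies whether the concurrency comes from an event loop, threads, or pipelined requests: latency that cannot be removed can often be hidden behind other useful work.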