TickRate

Tick rate is the frequency at which a game or simulation updates its core state. Measured in hertz, it indicates how many times per second the game logic advances. This is distinct from the display refresh rate, which controls how often frames are drawn on screen.
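This separation between a fixed logic rate and a variable render rate is typically implemented with a fixed-timestep loop. The sketch below is illustrative (the names and the 20 Hz rate are assumptions, not from any particular engine): real elapsed time is accumulated, and game logic advances one fixed tick at a time whenever enough time has built up.

```python
# Minimal fixed-timestep loop sketch (illustrative; names are hypothetical).
# The simulation advances in fixed ticks (here 20 Hz) no matter how
# irregular the incoming frame times are.

TICK_RATE = 20               # ticks per second
TICK_DT = 1.0 / TICK_RATE    # fixed timestep, in seconds

def run_ticks(frame_times):
    """Accumulate real elapsed time and run one tick per TICK_DT consumed."""
    accumulator = 0.0
    ticks = 0
    for frame_dt in frame_times:   # elapsed wall time per rendered frame
        accumulator += frame_dt
        while accumulator >= TICK_DT:
            accumulator -= TICK_DT
            ticks += 1             # advance game logic by exactly TICK_DT
    return ticks

# Roughly one second of frames (63 x 16 ms) still yields 20 ticks:
print(run_ticks([0.016] * 63))  # → 20
```

Because each tick consumes exactly `TICK_DT`, the simulation stays deterministic even when frame times fluctuate; leftover time simply carries over in the accumulator to the next frame.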

In online multiplayer games, tick rate often refers to the server tick rate: how many times per second the server processes input, runs physics, and synchronizes world state with clients. A higher tick rate can reduce input latency and improve synchronization, but requires more CPU processing and more bandwidth to distribute state updates.

Clients render frames at their own frame rate and use techniques such as client-side prediction and interpolation to hide network latency. The server remains the authoritative source of truth. When tick rate is low, prediction errors can become noticeable, leading to lag or desynchronization between players.

Tick rates are commonly fixed (for example, 20, 60, or 128 Hz) rather than tied to the frame rate. Some systems allow higher tick rates on dedicated servers. The practical maximum depends on network protocol, server hardware, and the typical round-trip time between client and server, as well as the amount of state that must be synchronized each tick.

In summary, tick rate measures how often game logic updates per second and is a key factor in online responsiveness and synchronization. It is not the same as frame rate, which measures rendering frequency.
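The client-side interpolation mentioned above can be sketched as follows. This is a simplified, single-value example under stated assumptions (a real client buffers full entity snapshots and renders slightly in the past); the function name and snapshot format are hypothetical:

```python
# Sketch of render-side interpolation between two server snapshots
# (illustrative only). Each snapshot is a (server_time, position) pair;
# the client blends between them at its own render time.

def interpolate(snap_a, snap_b, render_time):
    """Linearly blend two (time, position) snapshots at render_time."""
    t0, p0 = snap_a
    t1, p1 = snap_b
    if t1 == t0:
        return p1
    alpha = (render_time - t0) / (t1 - t0)
    alpha = max(0.0, min(1.0, alpha))   # clamp: never extrapolate here
    return p0 + (p1 - p0) * alpha

# Server ticks at 20 Hz (snapshots 50 ms apart); render a frame midway.
a = (0.00, 10.0)   # position at tick time 0 ms
b = (0.05, 12.0)   # position at tick time 50 ms
print(interpolate(a, b, 0.025))  # → 11.0
```

At a 20 Hz tick rate the gaps between snapshots are 50 ms wide, so interpolation errors on fast-moving objects are more visible than at 60 or 128 Hz, which is one concrete way a low tick rate surfaces as perceived lag.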