
FLOP

FLOP stands for floating point operation, a basic arithmetic operation performed on numbers represented in floating point format. In computing, floating point operations include addition, subtraction, multiplication, and division, as well as more complex functions expressed in floating point arithmetic. The term is used as a unit of computational performance for hardware and software, especially in scientific computing and benchmarks.
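As a rough illustration of how individual floating point operations are tallied, the sketch below evaluates a polynomial with Horner's rule and counts one FLOP per multiply and one per add. The function name and counting scheme are illustrative, not drawn from any standard:

```python
def horner_with_flop_count(coeffs, x):
    """Evaluate a polynomial at x via Horner's rule, tallying FLOPs.

    coeffs holds coefficients from highest to lowest degree.
    Each loop step performs one multiply and one add: two FLOPs.
    """
    result = coeffs[0]
    flops = 0
    for c in coeffs[1:]:
        result = result * x + c  # one multiply + one add
        flops += 2
    return result, flops

# 3x^2 + 2x + 1 at x = 2.0 -> value 17.0, 2 steps * 2 FLOPs = 4 FLOPs
value, flops = horner_with_flop_count([3.0, 2.0, 1.0], 2.0)
```

Counting every multiply and add this way is how the operation totals behind FLOPS figures are derived.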

Performance is expressed in FLOPS, floating point operations per second. Common prefixes denote scale: kiloFLOPS (10^3), megaFLOPS (10^6), gigaFLOPS (10^9), teraFLOPS (10^12), petaFLOPS (10^15), and exaFLOPS (10^18). A fused multiply-add (FMA) operation is typically counted as two FLOPs (one multiply and one add) in most benchmarks, though some methodologies treat an FMA as a single operation.

FLOPS-based measures have been central to high-performance computing since the 1960s. They are used to compare CPUs, GPUs, and supercomputers, and to rank systems informally. Benchmarks such as LINPACK, High-Performance Linpack (HPL), and other suites report FLOPS to indicate computational capability; exascale systems target around 10^18 FLOPS.

Limitations: FLOPS do not capture memory bandwidth, latency, software efficiency, or algorithmic complexity. Real-world performance depends on many factors, including memory hierarchy, parallelization, and numerical stability. Consequently, FLOPS are best viewed as one metric among several when evaluating a system or algorithm.
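The conventions above can be sketched concretely: a theoretical peak that counts each FMA as two FLOPs, an achieved rate measured LINPACK-style by timing a dense matrix multiply with the standard 2n^3 operation count, and the metric prefixes applied to the result. This is a minimal sketch under those assumptions; the function names and hardware parameters are illustrative:

```python
import time

def peak_flops(cores, ghz, fma_units, simd_width):
    """Theoretical peak FLOPS, counting each FMA as two FLOPs."""
    return cores * ghz * 1e9 * fma_units * simd_width * 2

def matmul(a, b, n):
    """Naive n x n matrix multiply: 2*n**3 FLOPs (n**3 multiplies, n**3 adds)."""
    c = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for k in range(n):
            aik = a[i][k]
            for j in range(n):
                c[i][j] += aik * b[k][j]  # one multiply + one add
    return c

def measured_flops(n=100):
    """Time a matmul and report the achieved rate as 2*n**3 / elapsed."""
    a = [[1.0] * n for _ in range(n)]
    b = [[1.0] * n for _ in range(n)]
    start = time.perf_counter()
    matmul(a, b, n)
    elapsed = time.perf_counter() - start
    return 2 * n**3 / elapsed

def with_prefix(flops):
    """Render a FLOPS value with the usual metric prefix."""
    for factor, prefix in [(1e18, "exa"), (1e15, "peta"), (1e12, "tera"),
                           (1e9, "giga"), (1e6, "mega"), (1e3, "kilo")]:
        if flops >= factor:
            return f"{flops / factor:.2f} {prefix}FLOPS"
    return f"{flops:.2f} FLOPS"
```

Pure-Python loops run far below any machine's peak rate, which itself illustrates the limitations noted above: the accounting is simply a fixed operation count divided by wall-clock time, and the gap between `peak_flops` and `measured_flops` reflects memory hierarchy and software efficiency, not the arithmetic alone.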