bits

Bits are the fundamental unit of information in information theory and digital computing. A bit represents one of two possible states, typically 0 or 1. The term bit is a contraction of binary digit and was coined by John Tukey in the late 1940s. In information theory, a single bit denotes the amount of information required to distinguish between two equally likely alternatives.
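
To make that last point concrete, here is a minimal Python sketch of the standard self-information formula, -log2(p): an outcome with probability 0.5, i.e. one of two equally likely alternatives, carries exactly one bit. The function name is illustrative.

```python
import math

def self_information(p: float) -> float:
    """Bits of information gained by observing an outcome of probability p."""
    return -math.log2(p)

# One of two equally likely alternatives (p = 0.5) carries exactly 1 bit.
print(self_information(0.5))  # 1.0
```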

In practical terms, bits quantify data size and transfer rates. Data storage and communication are measured in bits or in multiples such as kilobits, megabits, and gigabits, and data rates are commonly described in bits per second (bps). Hardware and software specifications describe processing capacity and data paths in terms of how many bits are moved or processed per operation, and eight bits form a byte, a more common unit for expressing memory and file sizes.
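
As a rough illustration of those unit relationships (the file size and link rate below are made up for the example), this sketch converts a size in bytes to bits and estimates transfer time over a link rated in bits per second.

```python
BITS_PER_BYTE = 8

file_size_bytes = 1_500_000                        # a 1.5 megabyte file (decimal prefix)
file_size_bits = file_size_bytes * BITS_PER_BYTE   # 12,000,000 bits = 12 megabits

link_rate_bps = 10_000_000                         # a 10 Mbps link
transfer_seconds = file_size_bits / link_rate_bps

print(f"{file_size_bits / 1_000_000:.0f} megabits")   # 12 megabits
print(f"{transfer_seconds:.1f} s at 10 Mbps")          # 1.2 s
```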

The physical realization of a bit relies on the binary states of a physical system, such as voltage levels in electronics or magnetic orientations in storage media. Because information content depends on probability, the theoretical framework behind bits uses base-2 logarithms, with one bit representing the information gained when uncertainty is reduced by half, as in the case of two equiprobable outcomes.
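
A small sketch of that base-2 framework, under the same assumption of equally likely outcomes: singling out one of n equiprobable alternatives takes log2(n) bits, since each bit halves the set of remaining possibilities.

```python
import math

# Each bit halves the remaining possibilities, so distinguishing one of
# n equally likely outcomes requires log2(n) bits of information.
for n in (2, 4, 8, 256):
    print(f"{n} equiprobable outcomes -> {math.log2(n):.0f} bits")
# 2 -> 1 bit, 4 -> 2 bits, 8 -> 3 bits, 256 -> 8 bits
```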

While eight bits equal one byte in standard usage, prefixes for data quantity can reflect decimal or binary bases, leading to differences in advertised versus actual storage capacity in some contexts.
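
As an example of that discrepancy (the figure is hypothetical), a drive advertised with the decimal prefix holds 500 * 10^9 bytes, while a tool reporting in binary gibibytes (2^30 bytes) shows a smaller number for the same capacity.

```python
advertised_bytes = 500 * 10**9      # "500 GB" using the decimal prefix (10^9 bytes)
in_gib = advertised_bytes / 2**30   # the same capacity in binary gibibytes (2^30 bytes)

print(f"500 GB (decimal) is about {in_gib:.2f} GiB (binary)")  # about 465.66 GiB
```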