Bit

A bit, short for binary digit, is the basic unit of information in computing and digital communications. It takes one of two values, conventionally written 0 and 1, and represents a choice between two distinct states. In information theory, a bit measures the amount of information needed to distinguish between two equally likely possibilities, quantifying uncertainty and information content.
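
For example, the information content of an outcome with probability p is -log2(p) bits, so one of two equally likely outcomes, such as a fair coin flip, conveys exactly one bit. A minimal Python sketch of this calculation (the function name is an illustrative assumption, not taken from the text above):

```python
import math

def self_information_bits(probability: float) -> float:
    """Information content, in bits, of an outcome with the given probability."""
    return -math.log2(probability)

print(self_information_bits(0.5))    # a fair coin flip: 1.0 bit
print(self_information_bits(1 / 8))  # one of eight equally likely outcomes: 3.0 bits
```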

A group of eight bits forms a byte, the standard unit for expressing storage and data size in most computer systems. Data transfer rates are typically expressed in bits per second (bps), while storage capacity is described in bytes and their multiples. To avoid ambiguity, binary prefixes such as the kibibit (Kibit, 1,024 bits) and mebibit (Mibit, 1,048,576 bits) denote power-of-two quantities, whereas decimal prefixes such as the kilobit (kbit, 1,000 bits) and megabit (Mbit, 1,000,000 bits), common in networking, denote powers of ten.
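
The difference between decimal network units and binary storage units is a frequent source of confusion. A minimal Python sketch of the conversion (the function name and the 100 Mbit/s figure are illustrative assumptions, not taken from the text above):

```python
BITS_PER_BYTE = 8

def megabits_to_mebibytes_per_second(rate_mbit: float) -> float:
    """Convert a decimal megabit/s line rate into binary mebibytes/s (MiB/s)."""
    bits_per_second = rate_mbit * 1_000_000        # kilo/mega prefixes: powers of ten
    bytes_per_second = bits_per_second / BITS_PER_BYTE
    return bytes_per_second / (1024 * 1024)        # kibi/mebi prefixes: powers of two

# A nominal 100 Mbit/s link carries roughly 11.9 MiB of data per second.
print(f"{megabits_to_mebibytes_per_second(100):.1f} MiB/s")
```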

Historically, the term bit was coined by John Tukey in 1947 as a contraction of binary digit, and its central role in information theory was popularized by Claude E. Shannon in 1948. The bit underpins modern digital technology, governing how information is encoded, transmitted, and processed.

Bits are manipulated by digital logic and stored in memory as sequences that encode numbers, characters, instructions, and multimedia. They enable error detection and correction, data compression, encryption, and various encoding schemes. In practice, the choice between bits and bytes depends on context: hardware and networks usually work in bits, while software and storage use bytes and higher-order units.
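
A simple illustration of bit-level error detection is a single even-parity bit appended to a data byte; the scheme below is a sketch chosen for illustration, not something specified in the text above:

```python
def even_parity_bit(byte: int) -> int:
    """Parity bit that makes the total number of 1-bits even."""
    return bin(byte & 0xFF).count("1") % 2

def transmit(byte: int) -> tuple[int, int]:
    """Pair a data byte with its parity bit, as a sender would."""
    return byte, even_parity_bit(byte)

def check(byte: int, parity: int) -> bool:
    """Recompute the parity on receipt; any single flipped bit is detected."""
    return even_parity_bit(byte) == parity

data, parity = transmit(0b01000001)   # the ASCII encoding of 'A'
assert check(data, parity)            # an intact transmission passes
corrupted = data ^ 0b00000100         # one bit flipped in transit
assert not check(corrupted, parity)   # the single-bit error is caught
```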