XOR

XOR, short for exclusive OR, is a binary operation that takes two input values and returns true (1) exactly when one input is true (1) and the other is false (0). It is a fundamental operator in digital logic and computer science. XOR is commonly denoted by the symbol ⊕, or by the caret ^ in many programming languages.

The truth table of XOR for two binary inputs A and B is:

A    B    A ⊕ B
0    0    0
0    1    1
1    0    1
1    1    0

A ⊕ B yields 0 when A and B are both 0 or both 1, and yields 1 when A and B differ.
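The same table can be reproduced with the caret operator mentioned above. The following is a minimal sketch in Python; the language is chosen only for illustration, since the article does not tie XOR to any particular one:

    # Enumerate the XOR truth table using Python's bitwise ^ operator.
    for a in (0, 1):
        for b in (0, 1):
            print(a, "XOR", b, "=", a ^ b)
    # Prints: 0 XOR 0 = 0, 0 XOR 1 = 1, 1 XOR 0 = 1, 1 XOR 1 = 0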

Algebraically, XOR is commutative and associative, and its identity element is 0. Each element is its own inverse under XOR, since A ⊕ A = 0. The operation is equivalent to addition modulo 2, and it forms the addition operation in the finite field GF(2). When applied to bitstrings, XOR performs a bitwise operation: corresponding bits are XORed independently.
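These properties are straightforward to check directly. The Python sketch below (illustrative, not part of the original text) verifies commutativity, associativity, the identity 0, self-inversion, the equivalence with addition modulo 2, and the bitwise behaviour on integers:

    # Check XOR's algebraic properties over single bits.
    bits = (0, 1)
    for a in bits:
        for b in bits:
            assert a ^ b == b ^ a                  # commutative
            assert a ^ b == (a + b) % 2            # addition modulo 2
            for c in bits:
                assert (a ^ b) ^ c == a ^ (b ^ c)  # associative
        assert a ^ 0 == a                          # 0 is the identity
        assert a ^ a == 0                          # each element is its own inverse

    # On integers, ^ acts bitwise: corresponding bits are XORed independently.
    assert 0b1100 ^ 0b1010 == 0b0110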

XOR plays a central role in both hardware and software. In digital circuits it is implemented with XOR gates and is essential in arithmetic circuits, parity generation, and error detection. In software, XOR is available as a bitwise operator for integers and as a logical operator for booleans in some languages. It is valued for its simplicity, speed, and the property that XORing a value with a key and then XORing the result with the same key returns the original value.
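The round-trip property can be demonstrated in a few lines. A short Python sketch, using arbitrary example numbers:

    # XORing with the same key twice restores the original value.
    value = 0b10110101        # example value (arbitrary)
    key   = 0b01101100        # example key (arbitrary)

    masked   = value ^ key    # obscure the value with the key
    restored = masked ^ key   # apply the same key again

    assert restored == value  # the original value comes back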

Applications include parity checking, error detection, data obfuscation, and cryptographic constructions such as the one-time pad, where XOR’s reversibility guarantees that encryption and decryption are identical operations.
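As an illustration of two of these applications, the sketch below computes a parity bit by folding XOR over the bits of a word and applies a one-time-pad style XOR to a byte string. The helper name parity and the sample message are hypothetical choices for this example, not part of any standard API:

    import secrets

    def parity(x: int) -> int:
        """Return 1 if x has an odd number of set bits, else 0 (XOR fold)."""
        p = 0
        while x:
            p ^= x & 1
            x >>= 1
        return p

    assert parity(0b1011) == 1  # three set bits -> odd parity

    # One-time-pad style masking: XOR each byte with a random pad byte.
    message = b"attack at dawn"
    pad = secrets.token_bytes(len(message))  # pad as long as the message
    ciphertext = bytes(m ^ k for m, k in zip(message, pad))
    recovered  = bytes(c ^ k for c, k in zip(ciphertext, pad))
    assert recovered == message  # the same XOR operation decrypts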
