Titrimetry

Titrimetry, or titration, is a quantitative chemical analysis in which a solution of known concentration, the titrant, reacts with a solution of unknown concentration, the analyte. The volume of titrant required to complete the reaction is used, together with the reaction’s stoichiometry, to calculate the amount of analyte present.
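
The arithmetic is straightforward. The sketch below (a minimal illustration with hypothetical figures, not a standard library routine) converts the titrant volume and concentration into an analyte concentration via the stoichiometric ratio:

    # Analyte concentration from a titration of the balanced reaction
    #   a A + t T -> products, so moles(A) = (a/t) * moles(T).
    def analyte_concentration(c_titrant, v_titrant_ml, v_analyte_ml, ratio=1.0):
        """c_titrant in mol/L, volumes in mL, ratio = mol analyte per mol titrant."""
        moles_titrant = c_titrant * v_titrant_ml / 1000.0  # mol of titrant delivered
        moles_analyte = ratio * moles_titrant              # stoichiometric conversion
        return moles_analyte / (v_analyte_ml / 1000.0)     # mol/L of analyte

    # Hypothetical example: 23.45 mL of 0.1000 M NaOH neutralizes 25.00 mL of HCl (1:1).
    print(analyte_concentration(0.1000, 23.45, 25.00))     # -> about 0.0938 mol/L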

The point at which the reaction is observed to be complete is called the end point; the equivalence point is the theoretical point of exact stoichiometric balance. In practice, end points are detected by indicators that change color, by pH or voltage measurements with a meter, or by monitoring another analytical signal for an abrupt change.
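
For meter-based detection, a common convention is to take the end point as the volume at which the measured signal changes fastest. A minimal sketch of this first-derivative approach, using invented pH readings:

    # Estimate the end point as the volume of steepest pH change
    # (maximum first derivative of the titration curve). Data are hypothetical.
    volumes = [20.0, 22.0, 24.0, 24.5, 24.9, 25.0, 25.1, 25.5, 26.0]  # mL
    ph      = [4.1,  4.5,  5.3,  5.9,  7.0,  8.7, 10.2, 10.9, 11.2]

    slopes = [(ph[i + 1] - ph[i]) / (volumes[i + 1] - volumes[i])
              for i in range(len(ph) - 1)]
    i_max = max(range(len(slopes)), key=lambda i: slopes[i])
    end_point = (volumes[i_max] + volumes[i_max + 1]) / 2.0  # midpoint of steepest step
    print(f"estimated end point: {end_point:.2f} mL")        # -> 24.95 mL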

Common types include acid-base titrations, redox titrations, complexometric titrations, and precipitation titrations. In acid-base titrations, indicators such as phenolphthalein or methyl orange signal the end point, or a pH meter can locate it. Redox titrations use oxidizing or reducing agents, for example permanganate or iodine solutions. Complexometric titrations, notably with EDTA, determine metal ions by forming stable complexes. Precipitation titrations involve a reaction that forms a precipitate, often detected by turbidity or, in argentometric methods with a silver salt, by the cessation of precipitate formation. Non-aqueous titrations use solvents other than water, extending the technique to samples that cannot be titrated in aqueous solution.
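
The stoichiometric ratio matters outside the 1:1 acid-base case. In acid solution, permanganate oxidizes iron(II) as MnO4- + 5 Fe2+ + 8 H+ -> Mn2+ + 5 Fe3+ + 4 H2O, i.e. five moles of analyte per mole of titrant; a sketch with assumed volumes and concentrations:

    # Redox titration with non-unit stoichiometry:
    # 5 mol Fe2+ are oxidized per mol of MnO4- delivered.
    c_kmno4 = 0.02000      # mol/L (hypothetical)
    v_kmno4_ml = 18.60     # mL delivered at the end point (hypothetical)
    moles_fe = 5 * c_kmno4 * v_kmno4_ml / 1000.0
    print(f"Fe2+ in sample: {moles_fe * 1000:.3f} mmol")  # -> 1.860 mmol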

Standardization of the titrant against a primary standard ensures accuracy; calculations follow from the volume delivered at the end point and the reaction’s stoichiometry. Modern titration often employs automatic titrators and potentiometric detection, which can improve precision and reduce operator bias.
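
For instance, sodium hydroxide titrant is commonly standardized against potassium hydrogen phthalate (KHP), a primary standard that reacts with NaOH in a 1:1 ratio. A sketch of the calculation with assumed figures:

    # Standardization: the exact titrant concentration follows from a weighed
    # primary standard. KHP (KHC8H4O4, M = 204.22 g/mol) reacts 1:1 with NaOH.
    mass_khp_g = 0.5104    # weighed portion of KHP (hypothetical)
    v_naoh_ml = 25.32      # NaOH volume to the end point (hypothetical)
    moles_khp = mass_khp_g / 204.22
    c_naoh = moles_khp / (v_naoh_ml / 1000.0)
    print(f"NaOH concentration: {c_naoh:.4f} mol/L")  # -> about 0.0987 mol/L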

Applications include determining acids and bases in pharmaceuticals, environmental analysis, water hardness, food chemistry, and clinical chemistry. Historically, titrimetry developed in the 19th century with methods introduced by Karl Friedrich Mohr and others, and it remains a fundamental technique in analytical chemistry.
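
As one applied illustration (figures assumed), total water hardness is conventionally reported as milligrams of CaCO3 per liter, computed from the EDTA volume at the end point:

    # Total hardness from an EDTA titration: EDTA binds Ca2+ and Mg2+ 1:1,
    # and the result is reported as mg CaCO3 (M = 100.09 g/mol) per liter.
    c_edta = 0.01000       # mol/L (hypothetical)
    v_edta_ml = 14.80      # mL at the end point (hypothetical)
    v_sample_ml = 50.00    # water sample titrated
    moles_metal = c_edta * v_edta_ml / 1000.0     # mol Ca2+ + Mg2+ complexed
    mg_caco3 = moles_metal * 100.09 * 1000.0      # equivalent mass as CaCO3, in mg
    hardness = mg_caco3 / (v_sample_ml / 1000.0)  # mg CaCO3 per liter
    print(f"hardness: {hardness:.0f} mg CaCO3/L") # -> about 296 mg/L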
