Argentometric titration

Argentometric titration, or argentometry, is a class of volumetric analysis in which silver ions (Ag+) are used as the titrant to determine the quantity of analytes that form insoluble silver salts, most commonly halide ions such as chloride, bromide, and iodide.

The technique relies on the precipitation of silver halides and on endpoint detection that indicates either the presence of remaining free Ag+ or the formation of a distinct silver salt. In direct argentometry (the Mohr method), silver nitrate is added to a solution containing halide ions in neutral to slightly alkaline medium. As long as halide ions are present, Ag+ precipitates as AgCl, AgBr, or AgI. The endpoint is reached when all halide has been precipitated and a small excess of Ag+ remains, detected by an indicator such as chromate, which forms brick-red Ag2CrO4 at the endpoint.

Back-titration methods, notably the Volhard method, use an excess of AgNO3 to ensure complete precipitation of halides. After the reaction, the remaining Ag+ is back-titrated with a thiocyanate solution (SCN−) using an appropriate indicator (often involving Fe3+/Fe(SCN)2+ chemistry). This approach is useful for samples with color, turbidity, or interfering substances that complicate direct detection.

Other argentometric approaches employ adsorption indicators (the Fajans method), in which a color change at the particle surface of the precipitate signals the endpoint. These variations broaden the applicability of argentometry to different sample types and sensitivities.

Applications and limitations: Argentometry is commonly used to determine halide content in water, food, and pharmaceutical preparations. Interferences arise from species that complex or otherwise bind Ag+, requiring careful pH control, sample preparation, and choice of indicator. The Mohr and Volhard methods remain foundational in inorganic analysis for halides and related species.
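As a worked illustration of the direct (Mohr) titration stoichiometry described above, the sketch below converts the AgNO3 titrant volume at the endpoint into a chloride concentration. The function name and the sample numbers are hypothetical, chosen only to show the 1:1 arithmetic.

```python
# Direct (Mohr) argentometric titration: Ag+ + Cl- -> AgCl(s), 1:1 stoichiometry.
# Sample values below are hypothetical, for illustration only.

MOLAR_MASS_CL = 35.45  # g/mol

def chloride_conc_mol_l(v_agno3_ml: float, c_agno3_mol_l: float,
                        v_sample_ml: float) -> float:
    """Chloride molarity from the AgNO3 volume at the Mohr endpoint (1:1)."""
    mol_ag = c_agno3_mol_l * v_agno3_ml / 1000.0  # mol Ag+ delivered
    return mol_ag / (v_sample_ml / 1000.0)        # mol Cl- per litre of sample

# Example: a 25.0 mL water sample consumes 18.40 mL of 0.1000 M AgNO3.
c_cl = chloride_conc_mol_l(18.40, 0.1000, 25.0)
print(f"Cl- = {c_cl:.4f} mol/L = {c_cl * MOLAR_MASS_CL * 1000:.1f} mg/L")
```

Because AgCl, AgBr, and AgI all precipitate 1:1 with Ag+, the same arithmetic applies to any single halide once its molar mass is substituted.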
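The Volhard back-titration arithmetic can be sketched the same way: the halide content is the total Ag+ added in excess minus the Ag+ recovered by the SCN− back-titration. The function name and volumes are hypothetical.

```python
# Volhard back-titration: excess AgNO3 precipitates all halide; the remaining
# Ag+ is back-titrated with SCN- (1:1).  Halide = Ag+ added - Ag+ back-titrated.
# Sample values below are hypothetical, for illustration only.

def halide_mol(v_ag_ml: float, c_ag: float,
               v_scn_ml: float, c_scn: float) -> float:
    """Moles of halide in the sample from a Volhard back-titration."""
    mol_ag_total = c_ag * v_ag_ml / 1000.0     # mol Ag+ added in excess
    mol_ag_excess = c_scn * v_scn_ml / 1000.0  # mol Ag+ left over (= mol SCN-)
    return mol_ag_total - mol_ag_excess

# Example: 50.00 mL of 0.1000 M AgNO3 added; back-titration consumes
# 12.30 mL of 0.1000 M KSCN.
n_x = halide_mol(50.00, 0.1000, 12.30, 0.1000)
print(f"halide = {n_x * 1000:.3f} mmol")
```

The difference form makes clear why the method tolerates colored or turbid samples: only the sharp Fe(SCN)2+ endpoint of the back-titration needs to be visible, not the precipitation step itself.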