Electrometer

An electrometer is an electrical instrument designed to measure electric charge, voltage, or current with extremely high input impedance. By placing a minimal load on the circuit under test, an electrometer can detect and quantify signals far too small for ordinary voltmeters or ammeters. Modern devices measure currents in the femtoamp-to-picoamp range, charges down to tens of picocoulombs, and voltages over moderate ranges, with high stability and low noise.
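To see why high input impedance matters, here is a minimal sketch of the loading error a meter introduces when measuring a high-impedance source. All component values are illustrative assumptions, not data from a specific instrument:

```python
def loaded_reading(v_source, r_source, r_meter):
    """Voltage actually seen by a meter with finite input resistance.

    The meter's input resistance forms a voltage divider with the
    source resistance, so the reading is pulled below the true value.
    """
    return v_source * r_meter / (r_source + r_meter)

# Hypothetical 1 V source with 1 gigaohm (1e9 ohm) internal resistance.
v, r_src = 1.0, 1e9

ordinary = loaded_reading(v, r_src, 1e7)    # typical 10 Mohm multimeter input
electro = loaded_reading(v, r_src, 1e14)    # assumed 1e14 ohm electrometer input

print(f"ordinary meter reads {ordinary:.4f} V")  # ~0.0099 V: ~99% error
print(f"electrometer reads  {electro:.6f} V")    # ~0.999990 V
```

The divider makes the trade-off concrete: a conventional meter's 10 megohm input swamps a gigaohm source, while an input impedance many orders of magnitude above the source resistance leaves the reading essentially undisturbed.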

Electronic electrometers typically use a very high-impedance input stage, such as a MOSFET or JFET, followed by a precision amplifier and a transimpedance or capacitive front end. They may operate in voltage, current, or charge-integration modes. Some designs employ a capacitor-resistor feedback network to convert input current to a voltage, while others integrate charge on an isolated capacitor and read out the accumulated charge. Many electrometers include guard rings, shielded enclosures, and careful thermal design to minimize leakage and drift.

Types historically include vacuum-tube electrometers and modern solid-state electrometers. The latter dominate today, offering greater stability, lower noise, and higher input impedance, typically in the range of 10^12 to 10^16 ohms. Calibration against known standards is essential, as bias currents, leakage, and dielectric absorption can affect accuracy.

Applications span physics research, radiation dosimetry, electrostatic measurements in materials science, capacitor testing, and semiconductor device characterization. Electrometers are often used with charge-sensitive detectors, ionization chambers, or specialized sensors to quantify tiny electrical quantities without disturbing the measurements.
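The two readout modes described above (a feedback resistor converting input current to a voltage, and charge accumulated on an isolated capacitor) can be sketched numerically. The resistor and capacitor values below are illustrative assumptions, not any particular instrument's design:

```python
def transimpedance_output(i_in, r_feedback):
    """Ideal current-to-voltage (transimpedance) stage: V_out = -I_in * R_f.

    A very large feedback resistor turns tiny currents into
    measurable voltages.
    """
    return -i_in * r_feedback

def integrated_charge(currents, dt, c_feedback):
    """Charge-integration mode: accumulate I * dt on a capacitor.

    Returns the total charge and the resulting capacitor voltage
    (Q = C * V), which is what the instrument actually reads out.
    """
    q = sum(i * dt for i in currents)
    return q, q / c_feedback

# 1 fA through an assumed 1 teraohm feedback resistor -> 1 mV output.
v_out = transimpedance_output(1e-15, 1e12)
print(f"transimpedance output: {abs(v_out) * 1e3:.3f} mV")

# Integrating a steady 10 fA for 100 s on an assumed 10 pF capacitor:
# 1 pC of accumulated charge, read out as 100 mV.
q, v_cap = integrated_charge([10e-15] * 100, 1.0, 10e-12)
print(f"integrated: {q * 1e12:.2f} pC -> {v_cap * 1e3:.0f} mV")
```

The sketch also shows why charge integration suits very small currents: the capacitor voltage keeps growing with time, so extending the integration window raises the signal without changing the front end.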