Voltmeters

A voltmeter is an instrument for measuring the electrical potential difference between two points in a circuit. Because it is connected in parallel with the points under test, it is designed with high input impedance so that it draws negligible current and does not perturb the circuit, and it may be built for DC, AC, or both. Readings are given in volts with a specified accuracy and range.
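
To see why high input impedance matters, here is a minimal sketch (Python, with illustrative values not taken from any particular instrument) of the loading effect: the meter's input resistance forms a voltage divider with the source's Thevenin resistance, so a low-impedance meter reads noticeably low.

    def measured_voltage(v_source, r_source, r_meter):
        """Voltage a meter with input resistance r_meter actually sees when it
        loads a source with Thevenin resistance r_source (all values in ohms)."""
        return v_source * r_meter / (r_source + r_meter)

    # Hypothetical example: a 5 V source behind 100 kilohms of source resistance.
    v, r_src = 5.0, 100e3
    print(measured_voltage(v, r_src, 10e6))  # ~4.95 V with a 10 megohm digital meter
    print(measured_voltage(v, r_src, 20e3))  # ~0.83 V with a 20 kilohm analog meter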

Analog voltmeters typically rely on a moving-coil galvanometer plus a calibrated series resistor to convert voltage into a measurable current that moves a pointer. They require careful calibration and can suffer from parallax error, scale nonlinearity, and sensitivity to temperature and magnetic fields.
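
As a rough sketch of how the series (multiplier) resistor is chosen, assuming a hypothetical movement with 50 µA full-scale current and a 2 kilohm coil (illustrative values only):

    def multiplier_resistance(v_full_scale, i_full_scale, r_coil):
        """Series resistance that brings the movement to full scale at the
        desired voltage: R = V / I - R_coil."""
        return v_full_scale / i_full_scale - r_coil

    # A 10 V range on a 50 uA, 2 kilohm movement needs a 198 kilohm multiplier.
    print(multiplier_resistance(10.0, 50e-6, 2e3))  # 198000.0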

Digital voltmeters use analog-to-digital converters to display voltage numerically. They generally provide higher input impedance, can measure DC and AC, and may offer true RMS, auto-ranging, and data logging. Digital multimeters usually include a voltmeter function with additional features.
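
The digital principle can be sketched as scaling the raw converter counts back to the input voltage. The sketch below assumes a hypothetical 12-bit ADC with a 3.3 V reference behind a 10:1 input attenuator; real instruments differ in resolution, reference, and front-end design.

    def counts_to_volts(counts, n_bits=12, v_ref=3.3, attenuation=10.0):
        """Convert a raw ADC reading to the voltage at the input terminals."""
        v_adc = counts * v_ref / (2**n_bits - 1)  # voltage at the ADC pin
        return v_adc * attenuation                # undo the front-end attenuator

    print(counts_to_volts(2048))  # ~16.5 V at the terminals for a mid-scale count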

Key considerations include input impedance, loading effects, accuracy class, range selection, probe quality, and safe connection practices, especially at higher voltages. For precise work, buffers, differential probes, or voltage dividers may be used.
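
For example, a resistive divider can bring a higher voltage into range; the reading is then multiplied back by the nominal ratio. The sketch below uses illustrative resistor values and also shows how the meter's own input resistance, in parallel with the bottom resistor, pulls the result slightly low.

    def parallel(r1, r2):
        return r1 * r2 / (r1 + r2)

    def divider_reading(v_in, r_top, r_bottom, r_meter):
        """Voltage across the bottom resistor as seen by a meter whose
        input resistance loads that resistor."""
        r_b = parallel(r_bottom, r_meter)
        return v_in * r_b / (r_top + r_b)

    # Hypothetical 100:1 divider (990 kilohm / 10 kilohm) measuring 400 V with a
    # 10 megohm meter; scale the reading back up by the nominal ratio.
    print(divider_reading(400.0, 990e3, 10e3, 10e6) * 100)  # ~399.6 V, slightly low due to loading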

Voltmeter development progressed from galvanometer-based instruments in the 19th century to electronic and digital devices. Today, voltmeters are essential in laboratories, industry, and consumer electronics, appearing as standalone meters and as functions within multimeters.