Voltmeter

A voltmeter is an instrument used to measure the electrical potential difference, or voltage, between two points in an electrical circuit. It is connected in parallel with the element under test, and to avoid altering the circuit it should draw as little current as possible. An ideal voltmeter has infinite input impedance, meaning no current is drawn from the circuit; real meters have finite but typically high input resistance to minimize loading errors.

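As a concrete illustration of loading error, the following minimal Python sketch (all component values are hypothetical examples) compares the voltage at the midpoint of a resistive divider with and without a meter of finite input resistance attached:

    # Loading error of a real voltmeter across the lower leg of a divider.
    # All component values below are assumed example numbers.

    def parallel(r1, r2):
        """Equivalent resistance of two resistors in parallel."""
        return r1 * r2 / (r1 + r2)

    V_SUPPLY = 10.0    # volts across the whole divider
    R_TOP = 100e3      # upper divider resistor, ohms
    R_BOTTOM = 100e3   # lower divider resistor, ohms
    R_METER = 10e6     # meter input resistance, ohms

    # Ideal meter (infinite impedance): no current drawn, no error.
    v_unloaded = V_SUPPLY * R_BOTTOM / (R_TOP + R_BOTTOM)

    # Real meter: its input resistance appears in parallel with R_BOTTOM.
    r_loaded = parallel(R_BOTTOM, R_METER)
    v_loaded = V_SUPPLY * r_loaded / (R_TOP + r_loaded)

    error_pct = 100 * (v_unloaded - v_loaded) / v_unloaded
    print(f"unloaded {v_unloaded:.4f} V, loaded {v_loaded:.4f} V, error {error_pct:.2f} %")

With these example values the error works out to about half a percent; a lower meter input resistance makes it correspondingly worse.
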
There are two main types of voltmeters: analog and digital. Analog voltmeters use a galvanometer with a calibrated series resistor network to extend the range. The current through the galvanometer is proportional to the voltage across the test points, and the pointer deflection is read from a scale. Digital voltmeters convert the input voltage to a digital value using an analog-to-digital converter and display it as a numeric readout. Digital meters generally provide higher accuracy and easier reading, and often offer higher input impedance through buffering.

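To make the two approaches concrete, the Python sketch below works out a series multiplier resistor for an assumed analog movement and scales a raw ADC code back to volts for an assumed digital front end; the movement and ADC parameters are illustrative, not taken from any particular meter:

    # Analog side: choose the series multiplier for a galvanometer movement.
    I_FULL_SCALE = 50e-6   # full-scale deflection current, amps (assumed)
    R_COIL = 2000.0        # movement coil resistance, ohms (assumed)
    V_RANGE = 10.0         # desired full-scale voltage of the range

    # At full-scale deflection: V_RANGE = I_FULL_SCALE * (R_COIL + r_series)
    r_series = V_RANGE / I_FULL_SCALE - R_COIL
    print(f"series multiplier: {r_series:.0f} ohms")  # 198000 ohms for these values

    # Digital side: scale a raw ADC code back to the input voltage.
    ADC_BITS = 12
    V_REF = 2.5            # ADC reference voltage, volts (assumed)
    ATTENUATION = 0.25     # assumed input divider ahead of the ADC

    def adc_code_to_volts(code):
        """Convert a raw ADC code to the voltage at the input terminals."""
        v_at_adc = code * V_REF / (2**ADC_BITS - 1)
        return v_at_adc / ATTENUATION

    print(f"code 2048 -> {adc_code_to_volts(2048):.3f} V")
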
AC and DC voltmeters differ in how they handle alternating and direct signals. DC voltmeters display the instantaneous or average voltage, while AC or true-RMS voltmeters measure the effective voltage of an alternating signal. Some meters are true-RMS capable, while others are average-responding and rely on a sine-wave conversion factor, which is accurate only for sinusoidal signals.

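The practical difference between the two AC measurement methods shows up on non-sinusoidal signals. The Python sketch below (synthetic samples, arbitrary amplitude) computes a true-RMS value and an average-responding value, the latter using the sine-wave form factor that such meters assume:

    # True-RMS versus average-responding AC measurement, on synthetic samples.
    # Waveform amplitude and sample count are arbitrary example values.
    import math

    N = 10000
    AMPLITUDE = 1.0

    def true_rms(samples):
        """Root-mean-square of the samples: the effective (heating) value."""
        return math.sqrt(sum(s * s for s in samples) / len(samples))

    def average_responding(samples):
        """Rectified average scaled by the sine-wave form factor (~1.111),
        which is how average-responding meters are calibrated."""
        rectified_avg = sum(abs(s) for s in samples) / len(samples)
        return rectified_avg * (math.pi / (2 * math.sqrt(2)))

    sine = [AMPLITUDE * math.sin(2 * math.pi * i / N) for i in range(N)]
    square = [AMPLITUDE if i < N // 2 else -AMPLITUDE for i in range(N)]

    for name, wave in [("sine", sine), ("square", square)]:
        print(f"{name}: true RMS = {true_rms(wave):.4f} V, "
              f"average-responding = {average_responding(wave):.4f} V")

Both methods agree on the sine wave, but the average-responding figure for the square wave reads about 11 % high, which is why true-RMS capability matters for distorted waveforms.
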
Applications include testing power supplies, components, and circuits, troubleshooting, and validating implementations in electronics labs and industry. Typical voltage ranges span from millivolts to kilovolts, with appropriate safety and isolation considerations. Calibration and specification adherence are important for accuracy, and meters may include additional features such as auto-range, data logging, and input protections.

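Auto-ranging, mentioned above, amounts to picking the smallest full-scale range that still covers the reading; a minimal Python sketch with a hypothetical set of ranges:

    # Minimal auto-range selection: pick the smallest range that covers the reading.
    # The range list is a hypothetical example, not any particular meter's spec.

    RANGES_V = [0.2, 2.0, 20.0, 200.0, 1000.0]   # full-scale values, volts

    def select_range(measured_volts):
        """Return the smallest full-scale range that the reading fits into."""
        for full_scale in RANGES_V:
            if abs(measured_volts) <= full_scale:
                return full_scale
        raise ValueError("input exceeds the highest range; observe safety and isolation limits")

    for v in (0.035, 1.5, 48.0, 230.0):
        print(f"{v} V -> {select_range(v)} V range")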