Measuring Voltmeter Input Impedance

Sometimes you have to measure a voltage from a source with a high output impedance. To do this accurately you need to know the input impedance of your voltmeter, which should ideally be much larger than the impedance of the source you're trying to measure. The following figure illustrates the situation.

The voltage we're trying to measure is \(V_x\) and it has an output resistance of \(R_x\). The meter has an input resistance of \(R_m\). The voltage measured by the meter will then be

\[V = \frac{V_xR_m}{R_x+R_m} = \frac{V_x}{1+R_x/R_m}\]

So you can see that \(V\approx V_x\) only when \(R_m \gg R_x\). If you happen to know the values of \(R_x\) and \(R_m\), then you can determine \(V_x\) even when the two resistances are of the same order of magnitude. How do you measure the input resistance of your voltmeter? If you apply a known \(V_x\) through a known \(R_x\), then you can solve for \(R_m\) in the above equation.

\[R_m = \frac{R_x}{\frac{V_x}{V}-1}\]
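Here is a minimal sketch of both calculations in Python. The function names and example values are ours, purely for illustration; they just evaluate the two formulas above.

    def corrected_voltage(v_meas, r_x, r_m):
        """True source voltage V_x recovered from the meter reading V,
        given the source resistance R_x and the meter input resistance R_m."""
        return v_meas * (1 + r_x / r_m)

    def meter_resistance(v_x, v_meas, r_x):
        """Meter input resistance R_m found by applying a known V_x
        through a known R_x and reading V on the meter."""
        return r_x / (v_x / v_meas - 1)

    # Example: a 10 Mohm meter reading a 5 V source through 1 Mohm
    # sees 5 * 10/11 = 4.545 V; correcting for the loading recovers ~5 V.
    print(corrected_voltage(4.545, 1e6, 10e6))   # ~5.0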

We did this for the old Scope multimeter you see in the following picture, using \(V_x = 12\,\text{V}\) and \(R_x = 1\,\text{M}\Omega\).

The meter read \(10.98\,\text{V}\), which gave us \(R_m \approx 10\,\text{M}\Omega\). This is pretty typical for most handheld multimeters.
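As a quick check, plugging these numbers into the sketch above, meter_resistance(12, 10.98, 1e6) returns about \(10.8\,\text{M}\Omega\), the same order as the nominal \(10\,\text{M}\Omega\) input resistance found on most handheld meters.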

Some really cheap multimeters, such as the one shown below that we got at Harbor Freight Tools, showed an input resistance of only \(1\,\text{M}\Omega\). You have to be careful about what you measure with a meter like that, since it can give you very inaccurate readings whenever the source impedance isn't much smaller than \(1\,\text{M}\Omega\), as the example below shows.
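To see how big the error can get, suppose the source you're measuring also has an output resistance of \(1\,\text{M}\Omega\) (a made-up value, just for illustration). The divider formula above then gives

\[V = \frac{V_x}{1+R_x/R_m} = \frac{V_x}{1+1} = \frac{V_x}{2}\]

so the meter reads only half the true voltage.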

This post as a pdf


© 2010-2023 Stefan Hollos and Richard Hollos
