----------------
Correct: if you know the actual resistance of *each* resistor, you can then use Ohm's Law to determine the current from the voltage drop. Using a vintage meter with a lower input impedance will yield decent results; using a modern meter with a much higher input impedance will yield more accurate results. This discussion was about which method is more accurate: measuring cathode current (as per the Scott spec) or calculating it from the voltage drop. Either method will work, but most technicians back then wouldn't go to the trouble of measuring the actual resistance of the resistors. The cathode-current method was therefore the better choice, since the resistance has no effect on the current reading.
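To make the voltage-drop method concrete, here's a quick sketch with made-up numbers (the resistor values below are my own illustration, not from this thread). It shows why the actual measured resistance matters when you compute current with Ohm's Law, I = V / R:

```python
# Estimating cathode current from the voltage drop across a cathode
# resistor using Ohm's Law, I = V / R. Values are hypothetical.

def bias_current_ma(v_drop_volts, r_measured_ohms):
    """Current through the cathode resistor, in milliamps."""
    return v_drop_volts / r_measured_ohms * 1000.0

# Suppose a nominal 10-ohm cathode resistor actually measures 10.5 ohms
# (still within 5% tolerance), with 0.5 V dropped across it:
nominal = bias_current_ma(0.5, 10.0)   # 50.0 mA if you trust the marking
actual = bias_current_ma(0.5, 10.5)    # ~47.6 mA using the measured value
print(round(nominal, 1), round(actual, 1))
```

A direct milliammeter reading sidesteps this entirely, since no assumed resistance enters the calculation.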
The correct meter (a non-switchable milliammeter) is not expensive, can be purchased for $20 or less from any major distributor, and will produce the most accurate bias adjustments. Voltmeters do have a tolerance range, and whatever that tolerance may be, it becomes the tolerance of your bias adjustment, plus the resistor tolerance on top of it. The non-switchable milliammeter has a tolerance rating of 1/10 of a percent (that's close enough for me).
The tolerance rating of your voltmeter can be found in the owner's manual for the meter. The tolerance rating of the resistors can be read directly from the color bands on the resistors.
A silver band means 10% tolerance, gold band means 5% tolerance.
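Here's a rough sketch of how those tolerances stack up in the worst case when you calculate current from a voltage reading (the percentages are my own illustrative assumptions):

```python
# Worst-case error when computing I = V / R: the meter reads voltage
# high by its tolerance while the resistor is actually low by its
# tolerance. Percentages below are hypothetical examples.

def worst_case_current_error_pct(meter_tol_pct, resistor_tol_pct):
    factor = (1 + meter_tol_pct / 100.0) / (1 - resistor_tol_pct / 100.0)
    return (factor - 1) * 100.0

# A 2% voltmeter reading across a 10% (silver-band) resistor:
print(round(worst_case_current_error_pct(2.0, 10.0), 1))  # ~13.3% worst case

# A 0.1% milliammeter measuring current directly (no resistor in the math):
print(round(worst_case_current_error_pct(0.1, 0.0), 1))   # ~0.1%
```

This is the arithmetic behind preferring the direct-current method: only one instrument's tolerance is in play.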
Best Regards,
Ryan
----------------
Dear Mr. Expert!
You are surely the king when it comes to running around in circles.
Read again Mdeneen's posts please and try to think for yourself instead of pulling out again this "H H Scott said..." over and over again.
An engineer like you is supposed to think, not to worship.