I asked our tech, Jason, and his response is pretty cool and makes sense. Without exact specs on the meters, assume Vdd is 3.0 V, and then this makes perfect sense:
The microcontroller (uC) on the voltage indicator can directly read voltages that are below its maximum operating voltage (Vdd). In that case there's no major loss of accuracy in the circuitry, and the third digit is accurate.

If the voltage to be measured is greater than Vdd, it must first be divided down with a resistor divider, and the tolerance of those resistors sets the final accuracy. If the two resistors each have a 1% tolerance, the output of the divider is only accurate to about 2% in the worst case. 2% of 4.4 V is 0.088 V, so for a verified voltage of 4.40 V the reading could be anywhere from 4.31 V to 4.49 V. If the third digit were displayed, it would only be a relative value.

There's another issue too: the ADC of the uC only has 10-bit resolution. If the whole range up to 25 V has to fit into those 10 bits, so that a single divider can be used, then the resolution can be no finer than 25 V / (2^10 - 1) ≈ 0.0244 V.
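To make the tolerance arithmetic concrete, here's a minimal sketch in Python of the worst-case stacking Jason describes, assuming the simple model where the two 1% errors just add; the 4.40 V test point comes from his example, and everything else is illustrative.

```python
# Worst-case divider error using the simple error-stacking model above:
# two 1% resistors -> roughly 2% worst case on the divided-down voltage.
R_TOLERANCE = 0.01                    # 1% tolerance per resistor (assumed)
worst_case = 2 * R_TOLERANCE          # errors assumed to add: 2%

v_true = 4.40                         # verified input voltage in volts
error = worst_case * v_true           # 0.088 V
v_low, v_high = v_true - error, v_true + error

print(f"error band: +/- {error:.3f} V")                      # +/- 0.088 V
print(f"possible reading: {v_low:.2f} V to {v_high:.2f} V")  # 4.31 V to 4.49 V
```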
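The ADC resolution figure falls out the same way; this sketch assumes the 10-bit ADC and 25 V full-scale range mentioned above.

```python
# Smallest voltage step a 10-bit ADC can resolve when a single divider
# maps the full 25 V measurement range onto the ADC input.
ADC_BITS = 10
V_FULL_SCALE = 25.0

lsb = V_FULL_SCALE / (2**ADC_BITS - 1)      # volts per ADC count
print(f"resolution: {lsb:.4f} V per step")  # ~0.0244 V, i.e. about 24 mV
```

Since roughly 24 mV per step is coarser than the 10 mV a third displayed digit implies, the last digit can't be trusted above Vdd even before resistor tolerance is considered.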
BOTTOM LINE: Above Vdd, the third digit is not precise.