When I want to calculate the true amp draw of a cell, am I correct in the following?
With an inline voltmeter, a mech with a 0.25 Ω build, and a fresh cell, let's say the resting voltage is 4.16 V and under load it sags to 3.8 V.
Of course the voltage drop (battery sag, mostly from internal resistance, but nonetheless) works out to 0.36 V. What I wanted confirmed is that the actual true amp draw of the cell is calculated from the voltage under load, i.e. 3.8 V, and not from the resting voltage of 4.16 V, correct? I.e. 3.8 / 0.25 = 15.2 A.
Please note, I'm not talking about safety here, which is always calculated from full voltage, i.e. 4.2 V. I just want to confirm that the true amp draw is calculated from the voltage under load (of course I know the meter itself is also about 0.4% imprecise).
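As a sanity check of the arithmetic, here is a minimal sketch using only the numbers from the post and Ohm's law (the `r_internal` line is an extra derived quantity, not something stated above):

```python
# Ohm's law check using the values quoted in the post.

def amp_draw(v_load: float, r_coil: float) -> float:
    """Current through the build, computed from voltage UNDER LOAD (I = V / R)."""
    return v_load / r_coil

v_rest = 4.16   # resting (open-circuit) voltage, volts
v_load = 3.80   # voltage measured while firing, volts
r_coil = 0.25   # build resistance, ohms

i_true = amp_draw(v_load, r_coil)   # 3.8 / 0.25 = 15.2 A
sag = v_rest - v_load               # 0.36 V, dropped mostly across the cell's IR
r_internal = sag / i_true           # implied internal resistance (derived estimate)

print(f"true draw: {i_true:.1f} A, sag: {sag:.2f} V, IR ~ {r_internal:.3f} ohm")
```

Using the resting voltage instead would give 4.16 / 0.25 = 16.64 A, which the circuit never actually carries, since the cell can't hold 4.16 V while delivering current through its own internal resistance.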