Exactly why you cannot automatically trust pre-built coils or blindly trust a friend's suggestion for accuracy. The difference between 0.17 ohm and 0.25 ohm is huge at sub-ohm range. Some people argue that trusting a cheap $10 ohm reader down to a tenth or hundredth of an ohm is risky, when even a $200 Fluke digital meter has to be recalibrated every year to stay precisely accurate.
Even if you use a $10 ohm reader, don't use a coil resistance that your battery's continuous discharge rating (amp rating) cannot fire safely. The battery's CDR isn't a suggestion; it's the safe operating limit of the cell.
Everyone is free to set their own safety parameters, and I can only say what mine are.
I try never to exceed 50% of the CDR (continuous discharge rating) of a fully charged battery (4.2V). So with a 20A battery, that would be 10A. Ohm's Law tells me that a 0.42 ohm build (4.2V ÷ 10A) is as low as I would want to use; a 0.4 ohm build would already pull 10.5A.
The reason I place a 50% limit is that as a battery ages, its mAh capacity degrades, and as the capacity degrades so does the battery's C rating (amp limit). So down the road, your 20A battery may only be a 10A battery.
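The 50%-of-CDR rule above boils down to one Ohm's Law step. Here's a minimal Python sketch of that arithmetic; the function names and the `headroom` parameter are just illustrative, not from any tool mentioned in the thread:

```python
# Sketch of the 50%-of-CDR safety rule described above.
# Worst case is a fully charged Li-ion cell at 4.2V.

FULL_CHARGE_V = 4.2

def max_safe_current(cdr_amps: float, headroom: float = 0.5) -> float:
    """Current ceiling after applying a safety headroom to the CDR."""
    return cdr_amps * headroom

def min_safe_resistance(cdr_amps: float, headroom: float = 0.5) -> float:
    """Lowest coil resistance (ohms) that stays under the current
    ceiling, per Ohm's law: R = V / I."""
    return FULL_CHARGE_V / max_safe_current(cdr_amps, headroom)

# Fresh 20A battery, 50% headroom -> 10A ceiling -> 0.42 ohm floor
print(round(min_safe_resistance(20), 2))  # 0.42
# Same battery aged to an effective 10A -> 0.84 ohm floor
print(round(min_safe_resistance(10), 2))  # 0.84
```

The aged-battery case shows why the headroom matters: the same 0.42 ohm build that was fine on day one would exceed the ceiling once the cell has degraded.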
This was very helpful. Appreciate the time you're taking to help me out.
I like the idea of 50% of CDR, hence the reason I wanted to get a 30A battery. Currently running 16.8 amps on a 20A battery, so I'm pushing it for sure.
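For what it's worth, that 16.8A figure checks out against the numbers earlier in the thread, assuming it comes from a 0.25 ohm build fired at full charge (that resistance is my assumption here, not stated in this post):

```python
# Quick check: peak current draw at full charge, I = V / R.
FULL_CHARGE_V = 4.2

def coil_current(resistance_ohms: float) -> float:
    """Current drawn by a coil of the given resistance at 4.2V."""
    return FULL_CHARGE_V / resistance_ohms

draw = coil_current(0.25)
print(draw)                        # 16.8 amps
print(round(draw / 20 * 100, 1))   # 84.0 -> percent of a 20A CDR
```

At 84% of the CDR, that build is well past the 50% headroom discussed above, which matches the "pushing it for sure" assessment.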