Question about battery charging

Status
Not open for further replies.

Bo0m5l4ng

Full Member
Oct 22, 2014
41
26
Tennessee
I went to school for computer science, so I know just enough about electrical engineering to get myself into trouble. I thought I would ask a few questions to make sure I am on the right page.

For charge current, I know the recommended maximum is a fraction of the C rating, typically 0.5C, but at the same time I am aware that trickle charging can have negative effects due to prolonged chemistry crap going on.

However, theoretically, the slower you charge, the higher the capacity you end up with, due to the 'rubber band' effect batteries seem to have: the harder you hit them with charge current, the more they rebound back down when you take them off the charger.

So it seems like tapering off the charge current as the cell nears full would be ideal. However, wouldn't a constant-voltage supply do this already, since the amperage depends on resistance and voltage, and as the cell nears 4.2V the effective voltage difference shrinks as well (only 0.2V between 4.0V and 4.2V, for instance)? Or do batteries typically get charged with constant current instead, until 4.2V is read from terminal to terminal?

I am not planning on making a charger, but I would like to understand how they function.
 

Mooch

Electron Wrangler
ECF Veteran
Verified Member
    May 13, 2015
    3,946
    15,442

    True trickle charging, continuous charging at a lower voltage after 100% is reached, is terrible for lithium cells. But low current charging to 100% charge and then disconnecting the cell from the charging voltage is perfectly okay.

    Check my blog for my charging rates table with some slow, standard, and rapid charge rates from OEM cell datasheets.

    That “rubber band” effect is just the effect of internal resistance (IR) on the charge termination voltage. The IR causes a voltage rise during charging, the opposite of the sag it causes during discharging, and this fools the charger into thinking the cell is further along in its charging cycle than it really is, so it terminates sooner. The result is a less-than-complete charge, and it gets worse as the charge current increases.
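
    To put rough numbers on that voltage rise (a minimal sketch with a hypothetical 40 mΩ cell and made-up voltages, not measurements of any particular battery):

```python
# Rough illustration of the IR voltage rise during charging (hypothetical numbers).
# The voltage the charger sees = the cell's true open-circuit voltage + I * IR.
IR_OHMS = 0.040        # assumed internal resistance, 40 mOhm
TRUE_VOLTAGE = 4.10    # assumed open-circuit voltage of the cell at this moment

for current_a in (0.25, 0.5, 1.0, 2.0):
    seen_by_charger = TRUE_VOLTAGE + current_a * IR_OHMS
    print(f"{current_a:4.2f} A charge: charger sees {seen_by_charger:.3f} V")

# At 2 A the charger already sees 4.18 V even though the cell is really at 4.10 V,
# so it enters CV, and reaches its termination current, earlier than it would at
# 0.25 A, ending the charge with less capacity stored in the cell.
```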

    A low enough termination current setting will prevent this, but many chargers just use 1/10 of the bulk charge current as the cutoff, which is what causes the problem. A fixed low termination current would allow a complete charge no matter how high the bulk charging rate was.

    But a higher bulk charge rate forces the cell into the CV portion of the charge sooner, due to the voltage rise from the cell’s IR, reducing the current flow sooner and stretching out that tapering phase. That’s why charge time doesn’t scale linearly with the bulk rate: going from 0.25A to 0.5A might halve the charge time, but going from 0.5A to 2A will not cut it to a quarter. It might be halved, though.

    The standard CC/CV Li-Ion charging profile uses a current-limited supply set to 4.20V. The full (limited) current flows until the battery terminal voltage reaches 4.2V, when the CV phase starts and the current begins tapering off. The charger does nothing different the entire time; the two phases are just a consequence of the circuit topology.

    When the tapering current falls to a designated termination level, the charger decides the cell is fully charged and stops the charge.
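
    As a sanity check on how the two phases and the termination current interact, here's a toy CC/CV simulation (a sketch only: an assumed 3000 mAh cell with a linear OCV curve, 50 mΩ of internal resistance, and a fixed 100 mA termination current, none of it from a real datasheet). It also shows why the charge time doesn't scale linearly with the bulk rate, as mentioned above:

```python
# Crude CC/CV charge simulation for a hypothetical 3000 mAh Li-Ion cell.
# Assumptions (not from any datasheet): OCV rises linearly from 3.5 V empty to
# 4.2 V full, 50 mOhm internal resistance, 4.20 V CV setpoint, 100 mA termination.
CAPACITY_AH = 3.0
R_INT = 0.050
V_CV = 4.20
I_TERM = 0.10
DT_H = 1.0 / 3600.0      # simulate in 1-second steps (expressed in hours)

def ocv(soc):
    """Assumed open-circuit voltage as a linear function of state of charge."""
    return 3.5 + 0.7 * min(soc, 1.0)

def charge_time_hours(i_bulk):
    soc, t = 0.0, 0.0
    while True:
        if ocv(soc) + i_bulk * R_INT < V_CV:
            i = i_bulk                        # CC phase: the current limit rules
        else:
            i = (V_CV - ocv(soc)) / R_INT     # CV phase: current tapers by itself
            if i <= I_TERM:                   # termination current reached -> stop
                return t
        soc += i * DT_H / CAPACITY_AH
        t += DT_H

for amps in (0.25, 0.5, 1.0, 2.0):
    print(f"{amps:4.2f} A bulk rate -> about {charge_time_hours(amps):.1f} h to terminate")

# The charger "does" nothing different between the phases; the taper falls out of
# the fixed 4.20 V setpoint. The higher bulk rates hit that setpoint earlier and
# spend a larger share of the charge in the taper, so quadrupling the current
# does not cut the time to a quarter.
```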

    The battery’s true internal voltage is different from what the charger reads, though, due to the effects of its IR. This IR is a combination of typical ohmic resistances and a “polarization resistance”, a gradient in the ion density across elements of the battery that causes a voltage differential. The effects of both disappear when current flow stops, but it can take a couple of hours for the cell to settle to its true resting voltage, where the ions have distributed evenly across the anode. After a few minutes, though, it’s pretty close to the resting voltage.
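
    A toy model of that relaxation after the charge ends (two made-up time constants, roughly a fast ohmic/polarization component plus a small, slow diffusion tail; the numbers are illustrative only):

```python
import math

# Hypothetical post-charge relaxation: the voltage read at the terminals settles
# toward the true resting voltage as the IR and polarization effects fade.
V_REST = 4.168                        # assumed true resting voltage
FAST_MV, FAST_TAU_MIN = 24.0, 1.5     # fast component: 24 mV, ~1.5 min time constant
SLOW_MV, SLOW_TAU_MIN = 8.0, 45.0     # slow tail: 8 mV, ~45 min time constant

def offset_mv(minutes):
    """Millivolts still remaining above the resting voltage."""
    return (FAST_MV * math.exp(-minutes / FAST_TAU_MIN) +
            SLOW_MV * math.exp(-minutes / SLOW_TAU_MIN))

for m in (0, 5, 30, 120):
    print(f"{m:3d} min after charge ends: reads {V_REST + offset_mv(m) / 1000:.3f} V")

# After a few minutes the reading is already within roughly 10 mV of the resting
# voltage; the last few millivolts of the slow tail take a couple of hours.
```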
     

    Imfallen_Angel

    Ultra Member
    ECF Veteran
    Apr 10, 2016
    1,711
    2,763
    Ottawa area, Canada
    Almost all modern/decent chargers, AND MODS, do this... they'll push a higher charge current until they read the batteries at around 80 to 90% of their rated value, and then trickle charge.

    I've got over 35 years in electronics and such, and this is a recognized approach for balancing battery life against the charging time required. It's been recognized for many years and applies to all rechargeable batteries, from the old nickel-cadmium types to car batteries.

    That said, many companies have jumped onto the "fast" bandwagon because people refuse to wait, so there's been more and more 2A charging available. For some larger batteries that's fine, but for an 18650 I will continue to go with 1A max for a "fast" charge and 0.5A for slow.

    Take the Nitecore D4 for example: with 2 batteries in it, it goes at 750mA for the main charging phase, then trickles the top-off at half that.

    Most decent mods will do similar.

    The goal is to keep the battery from heating up (as much as possible) while charging; heat is what causes the chemistry to "wear out", shortening lifespan and possibly making the battery unstable.
     

    Mooch

    Electron Wrangler
    ECF Veteran
    Verified Member
    May 13, 2015
    3,946
    15,442

    Does the D4 actually change the charge rate when nearing full?
    Or does it just naturally taper off as a result of the equalizing voltages and expected tapering current that results? At 85%-90% the charge current rate naturally falls off anyway. There’s no need to switch it to a lower level.

    I ask because I’ve never seen one of these chargers actually switch charge rates, just taper off.

    I agree with you regarding fast charging. Our phones and tablets use it due to our demands for fast-as-possible charging but it comes with a price, shortened battery life.
     

    Imfallen_Angel

    Ultra Member
    ECF Veteran
    Apr 10, 2016
    1,711
    2,763
    Ottawa area, Canada

    I've never sat and watched it to see it switch, but when I checked on it periodically while charging, I did see it change the charging rate at around the 80% range.

    According to the specs I've read (and as stated in their descriptions), the D4 is supposed to monitor the battery and adjust the charging rate, and everything I've tested with it appears to confirm that.

    I tried some AA and AAA cells that my usual charger was having issues charging correctly, and the D4 did put a bit of "new" life into them... some took over a day to charge, and when I'd check on the status I would see different charge levels being used regardless of the voltage reading.

    So simple answer: I'd go with "yes, I believe so".
     

    Mooch

    Electron Wrangler
    ECF Veteran
    Verified Member
    May 13, 2015
    3,946
    15,442

    Thanks.
    I’m not sure anyone should be relying on functionality descriptions from companies that don’t design the chargers themselves, especially with the translation issues that are always present.

    That charge rate switching may be the starting charge rate, not a continual monitoring and adjustment of the rate.

    Watch a charge more closely and check the current draw every minute once you get close to full. I think you’ll see a smooth transition from the bulk charge rate down to the termination current level. There’s just no need to add that extra circuitry and functionality, with its associated cost; the current tapering happens all by itself.

    None of the charger algorithms or charge-controller chips I’ve seen (hundreds), including the ones I’ve used for the chargers I’ve designed, switch rates. But I’m open to the idea of some company using this as some sort of “feature” to try to differentiate their product. I’ll see if I can get a D4 and check the charge.

    But you said almost all modern chargers do this, so I’m confused, as it’s not part of the charging algorithm for the hundreds of charge-controller chips I've read the datasheets for.

    Thanks again!
     

    Imfallen_Angel

    Ultra Member
    ECF Veteran
    Apr 10, 2016
    1,711
    2,763
    Ottawa area, Canada
    I'd say it's probably a bit of both. The tapering effect is probably a given in its algorithm, but at the same time the charger does appear to be monitoring the batteries and adjusting the rate: I've seen it handle multiple batteries placed in it at the same time and, after the minute or two it spends doing its analysis, charge them at different rates.

    But what I saw when I tested the batteries that were giving me issues was that the charger was definitely doing something more, with the rate changing multiple times, as if it was trying a certain level, analyzing the readings for a while, and then trying another setting.

    I'll admit that I didn't hook it up to any test equipment, as I simply don't have the time to do a full analysis of how Nitecore built their circuitry and algorithm, but from quick, basic observations of its behavior, it does appear to do what they claim.

    The "all modern" comment is more about the tapering off, but from the ones I've seen, it's also that some monitoring is now present, as they do shut off once the batteries are full, and then monitor the battery at intervals to check if the battery did lose some of it's charge, and don't simply stay on low "trickle" like older ones used to do.

    It's a bit similar to charging via mods. From very simple tests using a USB cord that monitors power draw, I was surprised how some do an actual pass-through (where you can vape while the mod is being charged and the incoming rate stays stable), others appear to switch to a low input and run off the battery (the incoming power rate drops), and others turn the incoming flow off completely as you fire the mod. It's been quite interesting to see the changes in technology in just the last 10-some years.
     

    Imfallen_Angel

    Ultra Member
    ECF Veteran
    Apr 10, 2016
    1,711
    2,763
    Ottawa area, Canada
    Just to add, if you go and take one apart and do a full analysis, I'll be very interested to see what you find.

    As I said, I did only very simple tests and observations of its behavior, and it might just be a case of them doing some very fancy footwork with how their algorithm functions... but regardless, this charger does a splendid job. It's been excellent and it's won my confidence.
     