Why bother with testing cell temperature?

Published by Mooch in Mooch's blog.

As an example, suppose one cell had a 30A continuous discharge rating (CDR) and another had a CDR of 20A. The 30A cell sounds like the better choice for a device that draws lots of current. But if it rises to 120°C while delivering that current, that cell is overrated and will have a very short life...if it doesn't vent or burst first. The 20A cell might actually be the better choice, depending on how hot it gets, something you could only determine with temperature data.

I feel strongly that temperature must be a part of any cell testing; otherwise the tests can't be used to compare cells. These tests are my first step in trying to get a handle on how we can give these cells true CDRs, based not only on capacity and voltage-under-load but also on how safe they are to use at different discharge current levels.

Manufacturers rate their cells for use at temperatures up to 60°C, maximum. At temperatures exceeding about 45°C a cell's aging accelerates, shortening its life. At 70°C-80°C a cell's self-heating starts increasing due to additional exothermic chemical reactions. If this self-heating is not stopped, or the heat pulled away by cooling, it will eventually lead to venting, bursting, and possibly thermal runaway. At approximately 120°C an important component in the cell, the separator, starts melting, leading to short-circuiting and even more self-heating. This is the point where the cell starts to be in big, big trouble. And so is anyone using that cell.
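
To make these thresholds easier to apply, here's a minimal sketch in Python (my own illustration, using the approximate figures from this article, not values from any standard) that maps a measured cell temperature onto the zones described above.

```python
def temperature_zone(temp_c: float) -> str:
    """Classify a measured cell temperature (°C) using the
    approximate thresholds described in this article.
    Illustrative only; not from any standard."""
    if temp_c < 45:
        return "normal operating range"
    if temp_c < 60:
        return "accelerated aging: cell life is being shortened"
    if temp_c < 70:
        return "above the manufacturer's maximum rated temperature"
    if temp_c < 120:
        return "exothermic self-heating: risk of venting or bursting"
    return "separator melting: short-circuiting and possible thermal runaway"

print(temperature_zone(85))  # exothermic self-heating zone
```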

Different chemistries have different temperature thresholds for thermal runaway, but all suffer similarly at temperatures below this (accelerated aging, exothermic reactions creating gas and increased internal pressure, separator melting, etc.), all of which can lead to venting and/or bursting. That's why I did not differentiate between the chemistries when setting the maximum temperature I would let a cell reach before stopping a test.

Testing ICR cells (LiCoO2, "lithium-cobalt-oxide", "LCO") is riskier than testing the IMR (LiMn2O4, "lithium-manganese-oxide", "LMO") cells we normally recommend for use in a vaping device. This is due to the lower thermal runaway temperature of lithium-cobalt, which makes measuring the cell temperature during a discharge all the more important.

I have set a safety limit of 100°C for all of my tests, which is a ridiculously high temperature to operate a cell at! But I know that vapers will always want to reduce device size by reducing the number of cells, so we'll go as hot as we can without getting too close to thermal runaway. Remember, though, that operating above about 45°C shortens cell life. If the cell exceeds 100°C before completing a discharge at its continuous discharge rating (CDR), then the cell is definitely overrated. It's just too dangerous to use continuously at that discharge current level. Under certain circumstances I'll let the discharge continue even if the cell temperature is above 100°C, but this is guaranteed to damage the cell and might lead to venting or thermal runaway.

For reasonable cell life, I have set a limit of 75°C. While this is high enough to speed up the aging of the cell, it will still allow using the cell for a reasonable amount of time before needing to replace it. Beware of using any cell at temperatures higher than this. Not only can the damage become quite severe very quickly, but it also takes you closer to the temperature at which the cell could vent.
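
As an illustration of how these two limits could be applied during a discharge test, here's a hypothetical sketch. The `read_cell_temperature` and `discharge_done` callables are placeholders for whatever a test rig actually provides; this is not my test setup, just the shape of the logic.

```python
import time

HARD_STOP_C = 100.0    # safety limit for my tests, from this article
RATING_LIMIT_C = 75.0  # limit for a rating with reasonable cell life

def run_discharge_test(read_cell_temperature, discharge_done):
    """Illustrative discharge-test loop. read_cell_temperature()
    returns the cell temperature in °C; discharge_done() returns
    True once the cell reaches its cutoff voltage. Both are
    hypothetical placeholders."""
    peak_c = 0.0
    while not discharge_done():
        temp_c = read_cell_temperature()
        peak_c = max(peak_c, temp_c)
        if temp_c >= HARD_STOP_C:
            return "FAIL: exceeded 100°C -- cell is overrated at this current"
        time.sleep(1.0)  # poll once per second
    if peak_c > RATING_LIMIT_C:
        return f"MARGINAL: peaked at {peak_c:.0f}°C (>75°C) -- expect shortened cell life"
    return f"PASS: completed with a peak temperature of {peak_c:.0f}°C"
```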

I realize that vaping does not discharge the cell continuously and that it will run cooler when used in a device, even if each time the device is fired it draws current equal to the cell's CDR. But we must have a safety margin when using these cells! If a device autofires, knowing that the cell you have picked will not vent, or worse, is very important. And a short-circuited cell might not destroy itself, and your device, if we pick one that runs cooler at high discharge current levels. This can only be done if we know how hot these cells get.

I would love to see the ECF community come together to create a set of standardized test requirements to use when comparing cells and determining their safety at different discharge current levels. Using these tests we could set an accurate and safe current limit for each cell. Not just for continuous current, but also for "pulse" current testing that better simulates what happens when cells are actually used in a device. Additional tests could include cell leakage rate (good for estimating degree of damage to a cell), internal resistance, total joules delivered for each discharge current level (not a test, just some math), and cycle life testing.
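
For the "total joules delivered" math mentioned above, one simple approach is to integrate voltage times current over the discharge. Here's a hedged sketch assuming a constant-current discharge logged as (time, voltage) samples; the trapezoidal rule and the sample log are my own illustration, not part of any proposed standard.

```python
def joules_delivered(samples, current_a):
    """Approximate total energy delivered during a constant-current
    discharge. `samples` is a list of (time_s, voltage_v) readings.
    Energy is integrated with the trapezoidal rule:
    E ≈ Σ I * (V_i + V_{i+1}) / 2 * (t_{i+1} - t_i)."""
    energy_j = 0.0
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        energy_j += current_a * (v0 + v1) / 2.0 * (t1 - t0)
    return energy_j

# Example: the start of a 20A discharge, logged once per second
log = [(0, 4.20), (1, 4.05), (2, 3.98), (3, 3.93)]
print(f"{joules_delivered(log, 20.0):.0f} J")  # 242 J over these 3 seconds
```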