Open the Samsung SDI INR21700-50E datasheet to page two. The discharge curves are there. 0.2C, 0.5C, 1C, 2C, at two or three temperatures depending on the revision. That graph contains more usable engineering information than everything else in the document combined, and most people skip it in favor of the capacity table on page one.
Rated capacity, 5000mAh, was measured at 0.2C. At 2C the cell delivers about 4600mAh. The Molicel P42A, rated 4200mAh, was designed for high-rate applications with thinner electrode coatings, and its 2C discharge curve sits proportionally closer to its 0.2C curve than the Samsung cell's does. On the discharge graph, two cells with different rated capacities can deliver similar energy at a given load. Neither datasheet shows you where that crossover happens.
Look at the area between the 0.2C curve and the 2C curve. That area is energy lost as heat. On the INR21700-50E at 2C, roughly 1.5 to 2Wh over a full discharge. In a 14S pack of 50 cells, that is 75 to 100Wh of heat per discharge, and since a 2C discharge lasts about 30 minutes, 150 to 200W of average dissipation.
Pack thermal design should start from this number.
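The arithmetic behind that number is short enough to sketch. The capacities and average voltages below are illustrative stand-ins, not values digitized from the actual INR21700-50E curves:

```python
N_CELLS = 50
DISCHARGE_HOURS_2C = 0.5          # a 2C discharge lasts ~30 minutes

def discharge_energy_wh(capacity_ah, avg_voltage_v):
    """Energy delivered over a full discharge: capacity x mean voltage."""
    return capacity_ah * avg_voltage_v

e_02c = discharge_energy_wh(5.0, 3.60)   # 0.2C reference discharge (assumed)
e_2c  = discharge_energy_wh(4.6, 3.55)   # 2C: less capacity, lower mean voltage

heat_per_cell_wh = e_02c - e_2c                 # energy lost as heat per cell
pack_heat_wh = N_CELLS * heat_per_cell_wh       # heat per full pack discharge
pack_avg_w = pack_heat_wh / DISCHARGE_HOURS_2C  # average dissipation while discharging

print(f"{heat_per_cell_wh:.2f} Wh/cell, {pack_heat_wh:.0f} Wh/pack, {pack_avg_w:.0f} W avg")
```

With these assumed figures the estimate lands around 1.7Wh per cell and roughly 170W of pack-level dissipation, inside the ranges above.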
Molicel's P42A lists DCIR at approximately 16mΩ. Comparing that number to another manufacturer's DCIR is probably the most frustrating exercise in cell selection, because the measurement is not standardized and the datasheet usually does not say enough about the test conditions to make any comparison meaningful.
Pulse duration drives the biggest discrepancy. A 1-second pulse captures ohmic resistance. Extend to 10 seconds and charge transfer resistance adds 30 to 40%. At 30 seconds, solid-state diffusion contributes too. The P42A reads 16mΩ at 1 second, roughly 22mΩ at 10, 25mΩ at 30. If a competing datasheet says 22mΩ, it could be the same cell measured differently.
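The pulse-duration effect can be captured with a single-RC model: ohmic resistance plus a charge-transfer branch that charges up over the pulse. The parameter values here are illustrative fits chosen to roughly reproduce the 16/22/25mΩ progression above, not Molicel data:

```python
import math

def apparent_dcir_mohm(t_s, r0=15.0, r1=10.0, tau=8.0):
    """Apparent DCIR (mOhm) vs pulse length: ohmic R0 plus a slower
    branch R1 that builds with time constant tau (seconds). Illustrative."""
    return r0 + r1 * (1.0 - math.exp(-t_s / tau))

for t in (1, 10, 30):
    print(f"{t:>2}s pulse -> {apparent_dcir_mohm(t):.1f} mOhm")
```

The same cell reads about 16, 22, and 25mΩ at 1, 10, and 30 seconds, which is the whole point: the number on a datasheet is a function of the stopwatch.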
SOC dependence is nonlinear. NMC resistance traces a U-curve, lowest around 50% SOC. LFP spikes approaching full charge, sometimes sharply enough to double the mid-SOC value. So a DCIR taken at 50% SOC on an LFP cell is the most flattering single point on the entire curve, and if the application charges to full daily, the 50% number has limited relevance to what the BMS actually sees during the topping phase.
At 0°C, DCIR roughly doubles. At minus 20°C, five to six times the room temperature value.
That last sentence is probably worth more engineering attention than anything else in this article, and it tends to get buried under the pulse-duration discussion. A pack designed at room temperature in a lab in Shenzhen, validated against the 25°C datasheet, ships to Finland and shuts down on the first cold morning. The battery is full. The BMS trips undervoltage because the loaded terminal voltage at minus 15°C dropped below the cutoff. The DCIR at that temperature was never calculated because the datasheet only reports 25°C, and the design engineer did not estimate it from the low-temperature discharge curves (if those curves were even provided). Molicel publishes enough cold data on the P42A to make the estimate. Most other manufacturers do not, and whether this is laziness or a deliberate choice to avoid drawing attention to cold-weather limitations is hard to say.
In post-mortem meetings this kind of failure usually gets pinned on the BMS firmware, because the fix (adjusting the undervoltage threshold or adding a temperature-dependent derating table) is a firmware change, which is cheaper and faster than a hardware revision. But the root cause is upstream: the pack power budget was built from room-temperature DCIR, nobody calculated what happens at minus 15°C, and the datasheet did not make it obvious that a calculation was needed. The information was technically derivable from the discharge curves. It was not in a form that demanded attention. Every cold-climate field failure report should trigger the same question: was terminal voltage at peak load at operating temperature checked against the undervoltage threshold across the full SOC range?
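That check takes a few lines. The DCIR multipliers come from the rules of thumb above; the OCV, peak current, and cutoff are assumed for illustration:

```python
DCIR_25C = 0.016           # ohm: fresh-cell 1s DCIR, P42A-like (assumed)
TEMP_MULTIPLIER = {25: 1.0, 0: 2.0, -20: 5.5}   # DCIR scaling vs temperature
UV_CUTOFF = 2.5            # V per cell, BMS undervoltage threshold (assumed)

def loaded_cell_voltage(ocv_v, peak_a, temp_c):
    """Terminal voltage under a peak-current pulse at a given temperature."""
    return ocv_v - peak_a * DCIR_25C * TEMP_MULTIPLIER[temp_c]

for temp in (25, 0, -20):
    v = loaded_cell_voltage(ocv_v=3.4, peak_a=30, temp_c=temp)  # low-SOC, 30A
    status = "OK" if v > UV_CUTOFF else "UV TRIP"
    print(f"{temp:>4} degC: {v:.2f} V -> {status}")
```

With these assumptions the cell clears the cutoff comfortably at 25°C and trips it in the cold, which is exactly the Finland failure mode: a full battery that the BMS cannot legally discharge.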
IEC 62660-1 and USABC define different measurement procedures. Most manufacturers use internal methods. The inconsistency will not change because switching procedures would break traceability with years of quality data.
There is a deeper issue with DCIR that goes beyond measurement standardization: the value on the datasheet is for a fresh cell. After 200 cycles, after 400 cycles, after a year of calendar aging at high SOC in a warm climate, the DCIR is different. How different? That depends on the cell chemistry, the cycling conditions, and the electrolyte quality, and it varies enough between manufacturers that a cell with a slightly higher initial DCIR but better impedance stability can outperform one with a lower initial number that degrades faster. The initial DCIR tells you what the cell can do on day one. Impedance growth tells you what the cell will do on day five hundred. Day five hundred is usually more relevant to warranty exposure.
Impedance growth over cycling is almost never published. A P42A at 16mΩ when new might read 24mΩ after 300 cycles, 35mΩ at 500. High-nickel NMC cells are vulnerable to a cathode microcracking feedback loop where cracked particles expose fresh surface, electrolyte decomposes on it, resistive films grow, and impedance rises. Past that knee, a cell can fail to sustain peak current even with 75% of its capacity remaining.
For applications where the cell must deliver high pulse current throughout its life (power tools pulling 30A, drones pulling 40A during takeoff, EVs during hard acceleration), the impedance knee defines end of life, not the capacity retention threshold. The datasheet publishes cycle life as "cycles to 80% capacity retention." In a power-limited application, the cell may become functionally useless at 85% capacity retention because impedance has grown enough to cause brownouts or protection trips at peak load. The 80% capacity figure is irrelevant. The impedance growth figure, which was never published, was the relevant end-of-life criterion all along.
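A sketch of the comparison, with impedance growth and capacity fade rates assumed to loosely match the illustrative P42A figures above (these are not published numbers):

```python
PEAK_A = 40.0        # peak pulse current the application demands (assumed)
OCV_LOW_SOC = 3.5    # open-circuit voltage near the low-SOC end (assumed)
UV_CUTOFF = 2.5      # BMS undervoltage threshold per cell

def dcir_ohm(cycle):
    """Impedance growth: ~16 mOhm new, ~24 at 300, ~35 at 500 (knee at 300)."""
    mohm = 16 + cycle * 8 / 300 if cycle <= 300 else 24 + (cycle - 300) * 11 / 200
    return mohm / 1000

def capacity_frac(cycle):
    """Linear fade reaching 80% retention at cycle 500."""
    return 1.0 - cycle / 2500

power_eol = next(c for c in range(2000)
                 if OCV_LOW_SOC - PEAK_A * dcir_ohm(c) < UV_CUTOFF)
capacity_eol = next(c for c in range(5000) if capacity_frac(c) <= 0.80)

print(f"power-limited EOL at cycle {power_eol} "
      f"with {capacity_frac(power_eol):.0%} capacity remaining; "
      f"capacity EOL at cycle {capacity_eol}")
```

Under these assumptions the pack browns out at peak load around cycle 319, with 87% of its capacity still intact, well before the published 80%-retention figure at cycle 500 would suggest anything is wrong.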
When requesting impedance growth data from manufacturers, the response (or lack of response, or confusion about what is being asked) is itself diagnostic. A manufacturer that has characterized impedance growth through end of life has done serious lifecycle engineering. One that has not may still make a fine cell, but the burden of lifecycle characterization falls on the pack designer.
On the EVE LF280K, cycle life reads ≥6000 at 80% retention. 0.5C/0.5C, 25°C, full voltage window.
Retention is measured by periodic 0.2C reference discharges. The cell cycles at the nominal rate, pauses every 50 or 100 cycles, and then gets a slow 0.2C capacity check. That 0.2C reference is compared to the original 0.2C capacity. Capacity fade hits harder at higher rates, so a cell at 80% on the 0.2C reference might only deliver 73% at 1C. IEC 62660-1 defines this protocol.
LFP cycle life claims in the thousands are generally credible. NMC811 above 1000 full-DOD cycles is a harder sell because the microcracking degradation knee (same mechanism described in the DCIR section) can show up between 300 and 800 cycles. There is some uncertainty about how much the knee position depends on electrode calendering pressure versus electrolyte additive package versus formation protocol. Published literature tends to focus on cathode composition and voltage window. Manufacturing variables get less attention in journals because the data is proprietary, but cell engineers at the manufacturers know that two cells with identical cathode powder, calendered to different densities, can show measurably different knee onset points. This kind of knowledge does not make it onto datasheets or into papers. It stays inside the factory.
Jeff Dahn's lab at Dalhousie published two relevant papers (J. Electrochem. Soc., 2019, 166, A3031 and 2020, 167, 090506) on predicting cycle life from early coulombic efficiency measurements. Steady-state coulombic efficiency of 99.95% versus 99.98% corresponds to roughly double the parasitic lithium loss rate over hundreds of cycles. Their methodology, developed in partnership with Tesla, predicts retention divergence years before capacity curves split visibly. Almost no commercial datasheet publishes steady-state coulombic efficiency. Whether this will change as high-precision chargers become cheaper is unclear, but for now it remains the most information-dense specification that essentially nobody provides.
Voltage window. Charging NMC to 4.1V instead of 4.2V and discharging to 3.0V instead of 2.5V sacrifices 20 to 30% of per-cycle capacity but doubles or triples total cycle count. Tesla restricts the window on both Panasonic 2170 and CATL LFP cells; the 2019 OTA update reducing maximum SOC on certain Model S vehicles after fire reports was a voltage window tightening. Nissan's early Leaf (2011 to 2015), which used the full range of its AESC NMC pouch cells without active thermal management, saw Arizona owners' packs fall below 70% capacity within four years. The datasheet specs were met. Phoenix summers at full SOC were not in the spec.
Calendar aging runs in parallel with cycle aging but datasheets test them separately from fresh cells. A cell sitting at 80% SOC and 35°C in a warehouse for six months before pack assembly starts its operational life with a deficit the datasheet does not account for. For equipment with long idle periods (backup power, seasonal tools), calendar aging may dominate cycle aging. The calendar life specification, when it exists, is typically less well characterized.
Charge rate effects: 0.5C charge might yield 800 cycles, 1.5C might yield 400 to 500, 3C perhaps 250. BMW's i3 charge algorithm for the Samsung SDI 94Ah prismatic used multi-stage current gated by temperature and SOC. The datasheet listed a single charge rate.
No datasheet discloses the electrolyte formulation. It determines more of what appears on the front page than electrode chemistry does for cells of comparable design, and the fact that it is completely absent from every datasheet is maddening.
Same electrodes, same separator, same loading, different electrolyte: different cold-temperature capacity, different gassing, different calendar aging. When a datasheet revision shows performance improvements without geometry or electrode changes, the electrolyte almost certainly changed. The revision log will not say so.
Shenzhen Capchem and Guangzhou Tinci supply most of the Chinese cell industry. Japanese and Korean manufacturers source from Mitsubishi Chemical, Ube Industries, and others. Central Glass supplies LiFSI, which is showing up in more premium formulations blended with LiPF₆ for better cold conductivity. The specific additive package (VC, FEC, PS, and proprietary compounds that do not appear in published literature) explains most of the performance gaps between cells with similar electrode stacks. Electrode-level teardowns miss it. GC-MS on the electrolyte catches it.
There is a tendency in competitive teardown reports to fixate on electrode thickness, cathode stoichiometry, and coating weight as if those numbers explain all the performance differences between two cells. They do not. Two cells with identical NMC622 cathodes, identical graphite anodes, identical loading, identical separator, from two different manufacturers, can differ by 30% in cycle life and 40% in calendar life because of what is dissolved in the liquid between the electrodes. Capchem's base electrolyte with a conservative VC-only additive package and Tinci's higher-margin formulation with FEC plus a proprietary co-additive for high-nickel cathodes produce measurably different cells even when everything else is held constant. The cell manufacturer may not even fully understand the additive chemistry, because the electrolyte supplier's formulation is itself proprietary. There are layers of trade secrets here that no datasheet penetrates.
Cell manufacturers switch electrolyte formulations between production runs without changing the cell model number. A cell produced in Q1 may have a different additive package than the same model number from Q3 because the electrolyte supplier updated their formulation or because pricing shifted the cost optimization. The only external signal of this is a shift in incoming inspection data, and most pack manufacturers do not inspect at the resolution needed to catch subtle changes.
Pouch cells are characterized under 50 to 200 kPa of external compression. Every datasheet spec comes from a compressed cell. The datasheet does not say so. The compression requirement lives in a separate application note. Loosely seated pouch cells lose 30% or more of cycle life. The Chevrolet Bolt EV recall that began in 2020, eventually covering roughly 140,000 vehicles built with LG Energy Solution cells, was an integration-domain failure with cells that met their datasheet specs.
Formation cycling at the factory consumes 8 to 12% of the cell's lithium building the initial SEI. Two production batches formed under slightly different thermal conditions because of seasonal HVAC variation can differ by 30mAh on a 3000mAh cell, both within spec but producing measurable pack-level differences when those batches end up in the same series string. Root cause identification requires asking the manufacturer's process engineering team directly.
Capacity on the datasheet comes from the third or fifth post-formation grading cycle, depending on the manufacturer's procedure. Different grading cycles produce slightly different numbers. The discrepancy is under 1%. CATL and Samsung SDI publish both typical and minimum capacity on prismatic datasheets; a datasheet listing only a typical value does not support worst-case analysis.
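Why the minimum-capacity line matters: in a series string, deliverable capacity is set by the weakest cell, so worst-case pack analysis needs the minimum, not the typical. The figures below are illustrative, not from a real datasheet:

```python
CAP_TYP_AH = 280.0   # typical capacity, prismatic-LFP-like (illustrative)
CAP_MIN_AH = 271.0   # assumed minimum spec

def series_string_capacity(cell_caps_ah):
    """A series string can only deliver what its weakest cell holds."""
    return min(cell_caps_ah)

cells = [CAP_TYP_AH] * 15 + [CAP_MIN_AH]   # 16S string, one worst-case cell
print(series_string_capacity(cells))       # the minimum governs
```

One cell at the minimum spec drags the whole string down to it, which is why a typical-only datasheet leaves the worst-case number to guesswork.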
CATL has factories in Ningde, Liyang, Shanghai, and elsewhere. LG Energy Solution operates in Ochang, Wroclaw, and Holland, Michigan. The same product line from different sites produces measurably different cells. Incoming inspection at the application C-rate, plotted lot by lot, matters more than the datasheet for production decisions.
Low-temperature capacity on LFP gets misread. A CATL 280Ah at minus 20°C delivers about 180Ah. The remaining 100Ah is still in the cathode. The electrolyte viscosity and charge transfer kinetics at that temperature prevent the cell from sustaining the demanded rate before hitting voltage cutoff. The capacity-temperature table shows one number per temperature with no rate axis. The derating is rate-dependent: 200Ah at 0.2C and minus 20°C, 180Ah at 0.5C, maybe 150Ah at 1C. NMC handles cold better because of higher ionic conductivity and more voltage headroom.
On the INR21700-50E, synthesizing the scattered data: at 2C and minus 10°C, delivered energy drops to roughly 55 to 60% of the headline 5000mAh at 0.2C and 25°C. Whether a cold-climate pack needs 10 cells or 17 depends on that number.
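The sizing consequence can be sketched directly. The headline energy assumes 5000mAh at a 3.6V nominal; the derate factor is the upper end of the 55 to 60% estimate above; the required pack energy is hypothetical:

```python
import math

HEADLINE_WH = 5.0 * 3.6   # ~18 Wh per cell at 0.2C and 25 degC (assumed nominal V)

def cells_needed(required_wh, derate=1.0):
    """Cell count to meet an energy requirement at a given derate factor."""
    return math.ceil(required_wh / (HEADLINE_WH * derate))

required = 180.0                        # Wh the application must deliver (hypothetical)
print(cells_needed(required))           # sized from the headline number
print(cells_needed(required, 0.60))     # sized for 2C at -10 degC
```

Sized from the headline, ten cells; sized for the actual cold-weather operating point, seventeen. Same cell, same requirement, 70% more hardware.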
Samsung SDI's Note 7 cells in 2016 met their published specs; the manufacturing defects cleared the screen. The self-discharge number on a datasheet gives no visibility into screening rigor. Incoming self-discharge testing at elevated temperature with millivolt-resolution OCV tracking is the best available external proxy.

Low-temperature charging below 0°C plates lithium. If the datasheet specifies a reduced charge rate for 0 to 10°C, follow it; if it is silent, do not charge below 0°C. CV termination at 0.05C leaves 2 to 3% of capacity uncharged; at 0.1C, 5 to 8%.

LFP fuel gauging runs into the flat voltage plateau and 50 to 100mV of hysteresis, which makes a single OCV-SOC lookup table unreliable by 5 to 10 percentage points. Tesla's 2021 LFP Model 3 drew owner complaints about SOC jumping, and OTA updates followed.

Datasheet revisions where the maximum operating temperature drops from 60°C to 55°C are field data. "UL 1642 certified" without a certificate number is not verifiable.
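The CV-termination shortfall mentioned above falls out of a simple model: if the CV-phase current decays roughly exponentially with time constant tau, the charge never delivered when terminating at current i_term is tau times i_term. The tau value here is an assumption tuned to land inside the ranges quoted:

```python
TAU_H = 0.5   # assumed CV current-decay time constant, in hours

def undelivered_fraction(term_c_rate, tau_h=TAU_H):
    """Fraction of capacity left uncharged at a given termination C-rate,
    under an exponential-decay model of the CV phase."""
    return tau_h * term_c_rate

for c_rate in (0.05, 0.10):
    pct = undelivered_fraction(c_rate) * 100
    print(f"terminate at {c_rate:.2f}C -> {pct:.1f}% undelivered")
```

A 0.5-hour time constant gives 2.5% at a 0.05C cutoff and 5% at 0.1C, consistent with the quoted ranges; real cells drift from the exponential model, which is where the upper ends of those ranges come from.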