Battery capacity and energy metrics are fundamental concepts in evaluating electrochemical energy storage systems. These parameters define how much charge a battery can store and deliver, as well as how much energy it contains relative to its size or weight. Understanding these terms requires precise definitions and awareness of the conditions under which they are measured.

Rated capacity refers to the manufacturer-specified amount of charge a battery can deliver under defined conditions, typically expressed in ampere-hours (Ah) or milliampere-hours (mAh). This value reflects performance under standardized reference conditions, usually a low discharge rate (often 0.2C, also written C/5) at room temperature (25°C). For example, an 18650 lithium-ion cell might have a rated capacity of 3.5 Ah when discharged over 5 hours at 25°C. The C-rate indicates the discharge current relative to the battery's capacity, where 1C means discharging the full capacity in one hour.
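As a quick illustration of the C-rate arithmetic, the Python sketch below (the function names and the 3.5 Ah cell are illustrative, not taken from any standard) converts a C-rate into a discharge current and an ideal discharge time.

```python
def discharge_current_amps(capacity_ah: float, c_rate: float) -> float:
    """Discharge current in A for a given C-rate: 1C drains the rated capacity in one hour."""
    return capacity_ah * c_rate

def ideal_discharge_time_hours(capacity_ah: float, current_a: float) -> float:
    """Ideal discharge time in hours, ignoring rate-dependent capacity loss."""
    return capacity_ah / current_a

capacity = 3.5  # hypothetical 3.5 Ah 18650 cell, as in the example above
print(discharge_current_amps(capacity, 0.2))      # 0.2C (C/5) -> 0.7 A, a 5-hour discharge
print(discharge_current_amps(capacity, 1.0))      # 1C -> 3.5 A, a 1-hour discharge
print(ideal_discharge_time_hours(capacity, 0.7))  # 5.0 hours
```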

Actual capacity differs from rated capacity because of real-world operating conditions. It measures the practical charge delivered under specific discharge rates, temperatures, and aging states. A lithium-ion battery rated at 3.5 Ah might deliver only 3.2 Ah when discharged at a 1C rate, or 2.8 Ah at -10°C. The gap between rated and actual capacity widens as the discharge rate increases and as the temperature moves away from the optimal range. Nickel-metal hydride batteries show similar behavior, while lead-acid batteries exhibit a more pronounced capacity reduction at low temperatures.

Energy density describes the amount of energy stored per unit volume, measured in watt-hours per liter (Wh/L). This parameter determines how compact a battery can be for a given energy storage requirement. Lithium-ion batteries typically achieve 250-700 Wh/L, with lithium cobalt oxide variants at the higher end and lithium iron phosphate at the lower end. In comparison, nickel-cadmium batteries offer 50-150 Wh/L, and lead-acid batteries provide 50-90 Wh/L.

Specific energy, also called gravimetric energy density, indicates energy storage per unit mass, expressed in watt-hours per kilogram (Wh/kg). This metric is crucial for weight-sensitive applications like electric vehicles and aerospace. Modern lithium-ion batteries achieve 150-300 Wh/kg, with advanced lithium-sulfur prototypes reaching 400-500 Wh/kg. Traditional lead-acid batteries offer only 30-50 Wh/kg, while nickel-metal hydride provides 60-120 Wh/kg.
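To make the two metrics concrete, the sketch below computes stored energy, specific energy, and volumetric energy density for a hypothetical lithium-ion cell. The mass and volume figures are assumptions for illustration; the underlying relation, energy equals capacity times average discharge voltage, is discussed later in this section.

```python
def cell_energy_wh(capacity_ah: float, avg_voltage_v: float) -> float:
    """Stored energy in Wh: capacity (Ah) times average discharge voltage (V)."""
    return capacity_ah * avg_voltage_v

def specific_energy_wh_per_kg(energy_wh: float, mass_kg: float) -> float:
    """Gravimetric energy density in Wh/kg."""
    return energy_wh / mass_kg

def energy_density_wh_per_l(energy_wh: float, volume_l: float) -> float:
    """Volumetric energy density in Wh/L."""
    return energy_wh / volume_l

# Hypothetical 18650-format cell: 3.0 Ah, 3.6 V average, 46 g, 16.5 mL (assumed values)
energy = cell_energy_wh(3.0, 3.6)                # 10.8 Wh
print(specific_energy_wh_per_kg(energy, 0.046))  # ~235 Wh/kg
print(energy_density_wh_per_l(energy, 0.0165))   # ~655 Wh/L
```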

Volumetric energy density and specific energy vary significantly among battery chemistries due to differences in electrode materials and cell construction. Lithium-ion batteries outperform most alternatives in both metrics because of their high-voltage electrodes and lightweight materials. Sodium-ion batteries, while similar in design to lithium-ion, typically show 20-30% lower values due to the heavier sodium ions and lower cell voltages. Solid-state batteries promise improvements in both metrics by enabling lithium metal anodes and higher voltage cathodes.

Measurement conditions critically affect reported capacity and energy values. Temperature influences electrochemical reaction rates and ion mobility. Most batteries show optimal performance around 25°C, with capacity reductions of 10-30% at 0°C and 50% or more at -20°C. High temperatures above 45°C can accelerate side reactions while temporarily improving performance. Lithium-ion batteries are particularly sensitive to low-temperature operation due to increased electrolyte viscosity and slower lithium-ion diffusion.
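The sketch below turns the rough retention figures quoted above (roughly 20% loss at 0°C, about 50% at -20°C, nominal performance near 25°C) into a simple linear-interpolation derating table. The table values are assumptions for demonstration, not measured data for any particular cell.

```python
# Assumed (temperature in °C, retained-capacity fraction) points based on the rough figures above
DERATING_POINTS = [(-20.0, 0.50), (0.0, 0.80), (25.0, 1.00), (45.0, 1.00)]

def capacity_retention(temp_c: float) -> float:
    """Linearly interpolate the retained-capacity fraction between table points."""
    pts = DERATING_POINTS
    if temp_c <= pts[0][0]:
        return pts[0][1]
    if temp_c >= pts[-1][0]:
        return pts[-1][1]
    for (t0, f0), (t1, f1) in zip(pts, pts[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(capacity_retention(-10.0))  # ~0.65 of rated capacity under this assumed table
```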

Discharge rate, expressed as a C-rate, affects measured capacity just as strongly as temperature does. A battery discharged at 0.1C (the 10-hour rate) will typically deliver 5-15% more capacity than when discharged at 1C (the 1-hour rate). This effect stems from kinetic limitations in ion transport and charge transfer reactions. Lithium iron phosphate batteries exhibit flatter discharge curves and better rate capability than lithium cobalt oxide, making their capacity less sensitive to discharge rate. Lead-acid batteries show pronounced capacity reduction at high rates due to sulfate formation and acid depletion.
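One classical way to model this rate dependence, particularly for lead-acid batteries, is Peukert's empirical relation; the sketch below applies it with an assumed exponent, purely as an illustration of how delivered capacity shrinks at higher currents.

```python
def delivered_capacity_ah(rated_ah: float, rated_hours: float,
                          current_a: float, peukert_k: float) -> float:
    """Peukert's relation: discharge time t = H * (C / (I * H)) ** k; delivered capacity is I * t."""
    t = rated_hours * (rated_ah / (current_a * rated_hours)) ** peukert_k
    return current_a * t

# Hypothetical 100 Ah lead-acid battery rated at the 20-hour rate, with an assumed exponent k = 1.2
print(delivered_capacity_ah(100.0, 20.0, 5.0, 1.2))    # 20-hour rate (5 A)  -> 100 Ah
print(delivered_capacity_ah(100.0, 20.0, 100.0, 1.2))  # 1-hour rate (100 A) -> roughly 55 Ah
```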

Standardization bodies define measurement protocols to ensure comparability. The International Electrotechnical Commission (IEC) specifies test methods for various battery types, including discharge rates, voltage cutoffs, and temperature conditions. For lithium-ion batteries, capacity is typically measured from 100% state of charge to a defined cutoff voltage (often 2.5-3.0V per cell) at specified C-rates. These standards help align manufacturer specifications with real-world expectations.

The relationship between capacity and energy involves the battery's average discharge voltage: energy (Wh) equals capacity (Ah) multiplied by the average voltage (V). Thus, two batteries with identical capacities but different chemistries can store different amounts of energy. For example, a 3.6V lithium-ion cell and a 1.2V nickel-metal hydride cell, each with 2 Ah of capacity, store 7.2 Wh and 2.4 Wh respectively. This voltage difference explains why lithium-ion systems dominate high-energy applications despite capacity ratings similar to those of other chemistries.

Battery state of charge (SOC) represents the available capacity relative to the maximum capacity at a given time. A 100% SOC indicates that the full rated capacity is available, while 0% SOC means no usable capacity remains under standard discharge conditions. However, SOC becomes harder to define at different discharge rates and temperatures, because the available capacity itself changes. Battery management systems use voltage, current integration, and sometimes impedance measurements to estimate SOC accurately.
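The snippet below is a minimal sketch of the current-integration (coulomb counting) approach mentioned above; a real battery management system would also correct the estimate with voltage and impedance measurements, which this sketch omits.

```python
class CoulombCounter:
    """Minimal coulomb-counting SOC estimator (no voltage or impedance correction)."""

    def __init__(self, capacity_ah: float, initial_soc: float = 1.0):
        self.capacity_ah = capacity_ah
        self.soc = initial_soc  # fraction between 0.0 and 1.0

    def update(self, current_a: float, dt_s: float) -> float:
        """Integrate current over one time step; positive current means discharge."""
        delta_ah = current_a * dt_s / 3600.0
        self.soc = min(1.0, max(0.0, self.soc - delta_ah / self.capacity_ah))
        return self.soc

# Hypothetical 3.5 Ah cell discharged at 1C (3.5 A) for 30 minutes, sampled once per second
estimator = CoulombCounter(capacity_ah=3.5)
for _ in range(1800):
    estimator.update(current_a=3.5, dt_s=1.0)
print(estimator.soc)  # ~0.5
```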

The distinction between nominal and actual energy becomes important in system design. Nominal energy assumes rated capacity and average voltage, while actual energy accounts for real operating conditions. A lithium-ion battery pack nominally rated at 75 kWh might deliver only 65 kWh in an electric vehicle due to temperature variations, discharge rate changes, and aging effects over time. System designers must account for these derating factors to ensure sufficient energy availability.
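A system designer might capture such derating with a set of multiplicative factors, as in the sketch below; the individual factor values are assumptions chosen only to reproduce the 75 kWh to roughly 65 kWh example above.

```python
def usable_energy_kwh(nominal_kwh: float, derating_factors: dict[str, float]) -> float:
    """Apply independent multiplicative derating factors to the nominal energy (simplified model)."""
    usable = nominal_kwh
    for factor in derating_factors.values():
        usable *= factor
    return usable

# Assumed derating factors; real values depend on chemistry, climate, duty cycle, and pack age
factors = {"low_temperature": 0.95, "high_discharge_rate": 0.96, "aging": 0.95}
print(usable_energy_kwh(75.0, factors))  # ~65 kWh
```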

Aging affects both capacity and energy metrics over time. Capacity fade describes the reduction in maximum available charge, while energy fade includes both capacity loss and a possible drop in operating voltage. A lithium-ion battery might retain 80% of its initial capacity after 500 cycles but show only 75% energy retention because increased internal resistance lowers the operating voltage. Different chemistries age through distinct mechanisms: lithium-ion cells experience solid electrolyte interphase (SEI) growth and lithium inventory loss, while nickel-based batteries suffer from electrode swelling and electrolyte depletion.
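The short sketch below shows why energy retention can lag capacity retention when rising internal resistance pulls down the average discharge voltage; the voltage figures are assumed and chosen to reproduce the 80% capacity / 75% energy example above.

```python
def energy_retention(capacity_retention: float,
                     avg_voltage_new_v: float, avg_voltage_aged_v: float) -> float:
    """Energy fade combines capacity fade with the drop in average discharge voltage."""
    return capacity_retention * (avg_voltage_aged_v / avg_voltage_new_v)

# 80% capacity retention with the average voltage assumed to sag from 3.60 V to 3.38 V
print(energy_retention(0.80, 3.60, 3.38))  # ~0.75, i.e. about 75% energy retention
```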

Measurement precision requires controlled laboratory conditions. Industry-standard test equipment maintains temperature within ±1°C and current within ±0.1% of set values during capacity and energy measurements. Automated test systems record voltage, current, and temperature at high frequency to integrate capacity precisely and calculate energy accurately. Field measurements under variable conditions inevitably show more variance than laboratory tests.
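The sketch below shows how such logged samples are typically reduced to delivered capacity and energy by numerical (here trapezoidal) integration; the synthetic discharge log is invented purely for demonstration.

```python
def capacity_and_energy(time_s: list[float], current_a: list[float],
                        voltage_v: list[float]) -> tuple[float, float]:
    """Trapezoidal integration of a discharge log into delivered capacity (Ah) and energy (Wh)."""
    capacity_as = 0.0  # ampere-seconds
    energy_ws = 0.0    # watt-seconds
    for k in range(1, len(time_s)):
        dt = time_s[k] - time_s[k - 1]
        capacity_as += 0.5 * (current_a[k] + current_a[k - 1]) * dt
        energy_ws += 0.5 * (current_a[k] * voltage_v[k]
                            + current_a[k - 1] * voltage_v[k - 1]) * dt
    return capacity_as / 3600.0, energy_ws / 3600.0

# Synthetic one-hour discharge at 3.5 A with the voltage sagging linearly from 4.0 V to 3.0 V
n = 3601
t = [float(k) for k in range(n)]     # one-second samples
i = [3.5] * n
v = [4.0 - k / (n - 1) for k in range(n)]
print(capacity_and_energy(t, i, v))  # (~3.5 Ah, ~12.25 Wh)
```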

Understanding these concepts enables proper battery selection and realistic performance expectations. A battery specified at 100 Wh/kg at 0.2C and 25°C will not deliver equivalent performance in a high-power application at low temperatures. Engineers must examine the test conditions behind manufacturer specifications and adjust for their specific application requirements. The ongoing development of battery technologies continues to push the boundaries of these parameters, with each new chemistry offering different tradeoffs between capacity, energy density, and operating conditions.