Every solar installer has faced the same tension. The homeowner wants the longest possible backup runtime, so the designer programs the battery to discharge down to 10% state of charge — a 90% depth of discharge. That battery, cycled daily at 90%, will degrade faster than an identical unit held at 80%. But drop the limit to 50% and calendar aging starts to erode the economics from the other direction. The unused capacity sits on the wall, degrading quietly, while the owner still paid for every kilowatt-hour of nameplate rating.
There is no free lunch. There is only an optimal setpoint. This guide gives you a chemistry-agnostic framework for choosing depth of discharge per project type, backed by real cycle-life data and a worked LCOS example you can drop straight into a client proposal.
TL;DR
For LFP batteries, 80–90% DoD typically minimizes LCOS in residential daily-cycling scenarios. NMC performs best at 60–80% DoD. Lead-acid should rarely exceed 50%. Lower DoD extends cycle life but increases calendar-aging cost per kWh if the battery sits underutilized. The optimal setting depends on chemistry, cycling frequency, and project economics — not manufacturer specs alone.
In this guide:
- The exact formulas for DoD, SoC, and usable kWh
- Cycle-life tables for LFP, NMC, and lead-acid at varying DoD levels
- A worked LCOS example comparing 80% vs. 90% DoD on a 10 kWh LFP system
- Why shallow cycling can raise — not lower — lifetime cost per kWh
- Manufacturer warranty floors vs. real-world optimal setpoints
- Residential vs. commercial DoD strategy
What Is Depth of Discharge?
Depth of discharge (DoD) is the percentage of a battery’s total nameplate capacity that has been discharged during a cycle. It is the single most important operational parameter for sizing solar storage because it directly determines how much energy the owner can actually use, how long the battery will last, and what each stored kilowatt-hour truly costs over the system’s life.
For a battery with a rated capacity of 10 kWh, a depth of discharge of 80% means 8 kWh has been withdrawn. The remaining 2 kWh stays in reserve. DoD is usually expressed as a percentage, and it relates to state of charge (SoC) through a simple identity: DoD (%) = 100% − SoC (%). An alternative form, more common in field calculations, is DoD = (Discharged Ah / Rated Ah) × 100%.
Nameplate capacity and usable capacity are not the same. A 10 kWh lithium battery may only expose 9 kWh to the inverter as “usable” energy, with the manufacturer reserving the top and bottom 5% to protect cell chemistry. When an installer sets a custom DoD limit in the energy management system, they are typically defining how much of that usable window can be tapped. Misunderstanding this distinction is a frequent source of warranty disputes and undersized backup designs.
DoD matters more in solar than in electric vehicles or consumer electronics for three reasons. First, solar batteries cycle daily — 300 to 365 cycles per year — so small changes in per-cycle wear compound rapidly. Second, the financial horizon is 10 to 20 years, making end-of-life prediction central to project returns. Third, manufacturer warranties enforce strict throughput and retention floors; breaching a recommended DoD can void protection even if the total energy limit has not been reached. Accurate battery modeling is a core strength of modern solar software, and DoD is the input that drives every subsequent calculation.
DoD vs. SoC: The Numbers Your BMS Reports
State of charge (SoC) and depth of discharge are two sides of the same coin. SoC tells you how much energy remains. DoD tells you how much has been withdrawn. In a well-calibrated system, they always sum to 100%. If the battery management system (BMS) reports 30% SoC, the battery has undergone 70% DoD since its last full charge.
The formula is straightforward:
SoC = 100% − DoD
This relationship holds at the cell level, but BMS reporting can introduce confusion. Some manufacturers report a “usable” SoC range that maps 0–100% to the customer-accessible window, while the absolute SoC might run from 5% to 95% of true cell capacity. A Tesla Powerwall might show 5% “reserve” at the bottom, meaning the BMS-reported 0% SoC is actually 5% absolute. Other vendors expose absolute SoC directly. An installer who programs an 80% DoD limit without knowing which scale the BMS uses may accidentally push the cells far harder than intended.
| SoC | DoD | kWh Discharged (10 kWh nameplate) |
|---|---|---|
| 100% | 0% | 0 |
| 90% | 10% | 1.0 |
| 80% | 20% | 2.0 |
| 50% | 50% | 5.0 |
| 20% | 80% | 8.0 |
| 10% | 90% | 9.0 |
| 0% | 100% | 10.0 |
Misreading BMS scales causes real problems. A warranty may require that the battery never falls below 10% absolute SoC. If the BMS displays usable SoC and the installer sets a 90% DoD on that scale, the cells could be hitting 0% absolute — a condition that voids coverage. Before commissioning, always confirm with the manufacturer whether the BMS reports usable or absolute state of charge. Document the setting in the handover pack. If you are preparing a proposal that specifies battery behavior, solar proposal software can help you model the correct usable window from the start.
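If you want to sanity-check a DoD setting against the warranty floor before commissioning, the conversion is easy to script. Below is a minimal Python sketch, assuming a BMS that holds fixed buffers at the top and bottom of true cell capacity; the function names, the 5% reserves, and the 10% warranty floor are illustrative assumptions, not any vendor's actual figures.

```python
def usable_to_absolute_soc(usable_soc_pct, bottom_reserve_pct=5.0, top_reserve_pct=5.0):
    """Map a BMS-reported 'usable' SoC (0-100%) onto absolute cell SoC.

    Assumes fixed reserves at the bottom and top of true cell capacity;
    the 5% values are illustrative, not taken from any datasheet.
    """
    usable_window = 100.0 - bottom_reserve_pct - top_reserve_pct
    return bottom_reserve_pct + (usable_soc_pct / 100.0) * usable_window


def below_warranty_floor(usable_soc_pct, warranty_floor_abs_pct=10.0):
    """True if a usable-scale reading actually sits below an absolute-SoC warranty floor."""
    return usable_to_absolute_soc(usable_soc_pct) < warranty_floor_abs_pct


# A 90% DoD limit on the usable scale bottoms out at 10% usable SoC each cycle.
print(usable_to_absolute_soc(10.0))   # 14.0 -> still above a 10% absolute floor
print(below_warranty_floor(0.0))      # True: 0% usable maps to 5% absolute here
```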
How DoD Affects Battery Cycle Life: Chemistry-by-Chemistry Tables
The relationship between DoD and cycle life is non-linear and chemistry-specific. Deeper discharge accelerates electrode degradation, lithium plating, and electrolyte breakdown. The tables below show approximate real-world values drawn from CATL and BYD datasheets, Battery University aggregated data, and NREL laboratory studies.
Lithium Iron Phosphate (LFP)
LFP dominates residential solar because of its thermal stability and long cycle life. The trade-off is lower energy density, but for a fixed wall-mount installation that rarely matters.
| DoD | Approx. Cycle Life | Usable kWh (10 kWh) | Lifetime Throughput |
|---|---|---|---|
| 100% | ~2,000 | 10.0 | 20,000 kWh |
| 90% | ~3,500 | 9.0 | 31,500 kWh |
| 80% | ~5,000 | 8.0 | 40,000 kWh |
| 50% | ~8,000+ | 5.0 | 40,000+ kWh |
Notice the pattern: dropping from 100% to 80% DoD more than doubles cycle life and doubles lifetime throughput. The 80% row is the sweet spot for most residential systems. At 50% DoD, cycle life extends dramatically, but throughput stalls because each cycle delivers only half the energy.
Nickel Manganese Cobalt (NMC)
NMC offers higher energy density and is common in compact all-in-one storage systems. It is more sensitive to deep discharge than LFP.
| DoD | Approx. Cycle Life | Usable kWh (10 kWh) | Lifetime Throughput |
|---|---|---|---|
| 100% | ~500 | 10.0 | 5,000 kWh |
| 80% | ~1,000 | 8.0 | 8,000 kWh |
| 60% | ~2,000 | 6.0 | 12,000 kWh |
| 50% | ~3,000+ | 5.0 | 15,000+ kWh |
NMC at 100% DoD delivers roughly one-quarter the cycle life of LFP at the same depth. For daily-cycling residential applications, NMC should almost always be programmed to 80% DoD or less. The chemistry simply cannot tolerate the same depth of stress.
Lead-Acid
Flooded and AGM lead-acid batteries still appear in off-grid and budget-conscious installations. Their cycle-life penalty for deep discharge is severe.
| DoD | Approx. Cycle Life | Usable kWh (10 kWh) | Lifetime Throughput |
|---|---|---|---|
| 50% | ~400 | 5.0 | 2,000 kWh |
| 30% | ~1,200 | 3.0 | 3,600 kWh |
Lead-acid throughput economics are poor compared with lithium. A 10 kWh lead-acid bank at 50% DoD delivers 2,000 kWh over its life. A 10 kWh LFP battery at 80% DoD delivers 40,000 kWh — twenty times more energy per kilowatt-hour of nameplate capacity. Sulfation accelerates when lead-acid cells sit at partial states of charge, which is exactly what happens in solar applications with irregular cycling. Most professional installers have moved to lithium for this reason alone.
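To make the chemistry comparison concrete, the sketch below encodes the approximate cycle-life figures from the three tables and computes lifetime throughput for any chemistry and DoD setting; the values are the same illustrative ones shown above, not datasheet guarantees.

```python
# Approximate cycle-life figures from the tables above (illustrative, not datasheet values).
CYCLE_LIFE = {
    "LFP":       {1.00: 2_000, 0.90: 3_500, 0.80: 5_000, 0.50: 8_000},
    "NMC":       {1.00:   500, 0.80: 1_000, 0.60: 2_000, 0.50: 3_000},
    "Lead-acid": {0.50:   400, 0.30: 1_200},
}


def lifetime_throughput_kwh(nameplate_kwh, dod, chemistry):
    """Lifetime throughput = nameplate x DoD x approximate cycle life at that DoD."""
    cycles = CYCLE_LIFE[chemistry][dod]
    return nameplate_kwh * dod * cycles


print(lifetime_throughput_kwh(10, 0.80, "LFP"))        # 40,000 kWh
print(lifetime_throughput_kwh(10, 0.50, "Lead-acid"))  # 2,000 kWh -> 20x less than LFP at 80%
```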
Calculating Usable kWh at Different DoD Settings
Practical battery sizing starts with the usable energy equation:
Usable kWh = Nameplate kWh × DoD%
Take a 10 kWh LFP battery as a worked example:
- At 90% DoD: 10 × 0.90 = 9.0 kWh usable
- At 80% DoD: 10 × 0.80 = 8.0 kWh usable
The 80% setting sacrifices 1.0 kWh per night — an 11% reduction in daily capacity. That may sound minor, but it has a direct sizing implication. If the home’s nightly load is 9 kWh, a 90% DoD setting allows a 10 kWh battery to cover the load. At 80% DoD, the same home needs an 11.25 kWh battery to avoid a shortfall. On projects where inverter compatibility or wall space limits the battery count, that 11% can force a design change.
Lifetime throughput adds the cycle-life dimension:
Lifetime Throughput = Nameplate × DoD × Cycles × (1 − Degradation Factor)
The degradation factor accounts for the fact that a battery does not deliver its day-one capacity for its entire life. For warranty modeling, a degradation factor of 0.8 to 0.9 is typical. A battery with 5,000 cycles at 80% DoD and a 0.85 degradation factor delivers:
10 kWh × 0.80 × 5,000 × 0.85 = 34,000 kWh effective lifetime throughput
These are the numbers that feed directly into LCOS. Good solar design software automates this arithmetic, but every installer should understand the mechanics so they can sanity-check the output and explain it to a client.
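A minimal Python version of that arithmetic is shown below; the function names are ours, and the 0.85 degradation factor is simply the worked-example value from above.

```python
def usable_kwh(nameplate_kwh, dod):
    """Usable energy per cycle at a given DoD setting (dod as a fraction, e.g. 0.80)."""
    return nameplate_kwh * dod


def required_nameplate_kwh(nightly_load_kwh, dod):
    """Smallest nameplate capacity that covers a given nightly load at the chosen DoD."""
    return nightly_load_kwh / dod


def effective_lifetime_throughput_kwh(nameplate_kwh, dod, cycles, degradation_factor=0.85):
    """Nameplate x DoD x cycles x degradation factor, per the formula above."""
    return nameplate_kwh * dod * cycles * degradation_factor


print(usable_kwh(10, 0.80))                                # 8.0 kWh per cycle
print(required_nameplate_kwh(9, 0.80))                     # 11.25 kWh for a 9 kWh nightly load
print(effective_lifetime_throughput_kwh(10, 0.80, 5_000))  # 34,000 kWh effective throughput
```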
The Shallow Cycling Paradox: When Lower DoD Hurts More Than It Helps
It seems intuitive that lower DoD should always reduce cost per kilowatt-hour. The battery lasts longer, so the fixed capital cost spreads across more cycles. But there is a counterintuitive limit: the shallow cycling paradox.
Calendar aging operates independently of cycle count. A lithium battery degrades chemically whether it is cycling or not. Temperature, time, and state-of-charge window all drive capacity fade. If you set an LFP battery to 50% DoD, each cycle delivers only 5 kWh. With 8,000+ theoretical cycles, the battery would need more than 22 years of daily cycling to reach its cycle-life limit. No residential warranty extends that far. Calendar aging — not cycle wear — will end the battery’s useful life first.
The owner paid for 10 kWh of capacity but only ever used 5 kWh per day. The stranded 5 kWh degraded on the wall and produced no value. The result is a higher effective cost per kilowatt-hour than a moderately deeper cycling regime would have achieved.
Shallow cycling still makes sense in specific applications. Commercial peak-shaving systems may only cycle 50 to 150 times per year. A 30% DoD limit, combined with a large nameplate bank, can minimize wear while capturing tariff arbitrage. Seasonal backup systems that sit idle for months should use the lowest DoD that meets the autonomy requirement. But for residential daily self-consumption, 50% DoD is usually too shallow. The optimal point sits where the marginal gain in cycle life still outpaces the marginal loss from calendar aging and stranded capacity.
Manufacturer DoD Recommendations vs. Real-World Best Practice
Manufacturer datasheets optimize for warranty survival, not necessarily for the lowest lifetime cost per kilowatt-hour. Understanding the difference protects your margin and your client’s long-term returns.
- Tesla Powerwall: Rated for 100% DoD. The warranty is throughput-based (unlimited cycles within an energy threshold), so the DoD floor is less relevant than total energy delivered. In practice, daily cycling at 100% accelerates degradation beyond the warranty curve.
- LG ESS Home: Recommended 90% DoD. Warranty requires 60% capacity retention at 10 years. Pushing the battery to 95% DoD regularly risks falling below that retention floor before the warranty expires.
- BYD Battery-Box: 90% recommended DoD in most system designs. The BMS allows deeper discharge, but sustained operation above 90% voids certain warranty protections.
- CATL: 80% recommended DoD in most integrated system designs. CATL cells can technically survive deeper cycling, but the warranty assumes an 80% operating window.
The key risk is the warranty claim. Cycling beyond the manufacturer’s recommended DoD may void protection even if the total throughput limit has not been reached. A client who reads “100% DoD rated” on a spec sheet and demands the maximum backup runtime may not understand that the warranty assumes a narrower operating band. Document the DoD setting in the commissioning report and have the client acknowledge it.
Model Battery Performance in Your Solar Projects
SurgePV’s generation and financial tool runs degradation scenarios and lifetime yield calculations automatically.
Book a Demo · No commitment required · 20 minutes · Live project walkthrough
LCOS Optimization: Finding the DoD That Minimizes Lifetime Cost per kWh
Levelized cost of storage (LCOS) brings together capital cost, operating cost, and lifetime energy throughput into a single figure: the cost per kilowatt-hour delivered over the battery’s life.
LCOS = (CapEx + PV of O&M) / Lifetime Energy Throughput (kWh)
Consider a 10 kWh LFP battery with an installed cost of $8,000. O&M is minimal for lithium, and discounting is set aside to keep the arithmetic transparent, so the comparison comes down to capital cost and lifetime throughput.
Scenario A — 90% DoD:
- Cycles: ~3,500
- Throughput: 3,500 × 9.0 kWh = 31,500 kWh
- LCOS: $8,000 / 31,500 = $0.254/kWh
Scenario B — 80% DoD:
- Cycles: ~5,000
- Throughput: 5,000 × 8.0 kWh = 40,000 kWh
- LCOS: $8,000 / 40,000 = $0.200/kWh
Scenario C — 50% DoD (shallow):
- Theoretical cycles: ~8,000+
- Effective cycles limited by 15-year calendar life: ~5,500
- Throughput: 5,500 × 5.0 kWh = 27,500 kWh
- LCOS: $8,000 / 27,500 = $0.291/kWh
Scenario B wins. The 80% DoD setting delivers the lowest LCOS because it balances extended cycle life against meaningful per-cycle capacity. Scenario A suffers from shorter cycle life. Scenario C suffers from stranded capacity: the owner paid for 10 kWh but only ever extracted 5 kWh per cycle, and calendar aging cut the effective life short before the cycle-life limit was reached.
There are edge cases where 90% DoD LFP yields lower LCOS than 80%. In very high capital-cost markets, or where financing costs dominate, the value of extracting more energy per cycle early in the project can outweigh the faster degradation. The right approach is to plot LCOS against DoD for each chemistry and project type, then choose the minimum. Modern solar software can run these scenarios in seconds.
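Here is a compact Python sketch of the three scenarios; the helper names are ours, and it uses the same simplification as the worked example (no O&M, no discounting), so it can be extended into a full DoD-versus-LCOS sweep for a specific project.

```python
def lcos_per_kwh(capex_usd, cycles, usable_kwh_per_cycle, om_pv_usd=0.0):
    """LCOS = (CapEx + present value of O&M) / lifetime throughput.

    Same simplification as the worked example: O&M and discounting are ignored.
    """
    return (capex_usd + om_pv_usd) / (cycles * usable_kwh_per_cycle)


CAPEX = 8_000  # installed cost of the 10 kWh LFP example

scenarios = {
    "A - 90% DoD": (3_500, 9.0),
    "B - 80% DoD": (5_000, 8.0),
    "C - 50% DoD": (5_500, 5.0),  # cycles capped by the ~15-year calendar life
}

for label, (cycles, kwh_per_cycle) in scenarios.items():
    print(f"{label}: ${lcos_per_kwh(CAPEX, cycles, kwh_per_cycle):.3f}/kWh")
# A - 90% DoD: $0.254/kWh
# B - 80% DoD: $0.200/kWh
# C - 50% DoD: $0.291/kWh
```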
Residential vs. Commercial DoD Strategy
The optimal DoD depends on how often the battery cycles and what revenue stream it captures.
Residential daily self-consumption: These systems cycle 300 or more times per year. The wear model is dominated by cycle life, not calendar aging. For LFP, 80–90% DoD is usually ideal. For NMC, 60–80% DoD protects the shorter-lived chemistry. The design goal is to maximize self-consumed solar while staying inside the warranty curve.
Commercial peak shaving: Commercial batteries may only cycle 50 to 150 times per year. The revenue comes from demand-charge reduction and time-of-use arbitrage, not daily self-consumption. A 30–50% DoD limit is often sufficient because the tariff spread is captured with shallow draws. Deeper cycling wastes cycle life for marginal savings. In these cases, oversizing the nameplate capacity and running shallow is frequently the cheapest route.
Backup-only systems: The battery may sit at high SoC for months. The correct DoD is the lowest value that still meets the autonomy requirement. If the client needs 8 kWh of backup for a critical loads panel, size the battery to deliver that at 70% DoD rather than 90%. The cells stay healthier, and the system is ready when the grid fails.
Hybrid cases: Advanced energy management systems allow programmable DoD by season or tariff period. The battery can discharge to 90% during summer evenings when solar recharge is guaranteed, then drop to 60% in winter when cycling is lighter. Programmable logic turns a static DoD setting into a dynamic optimization tool.
Accurate modeling of these scenarios is where solar design software delivers the most value. Static rules of thumb get you close; project-specific LCOS plots get you to the right answer.
Conclusion
Depth of discharge is a dial, not a switch. The optimal setting sits at the intersection of cell chemistry, annual cycle count, and the financial model driving the project.
For most residential LFP installations running daily self-consumption, 80–90% DoD hits the LCOS floor. The cycle-life extension from 80% outweighs the capacity sacrifice, while 90% remains defensible when capital costs are high or nightly loads are large. NMC systems should stay at 60–80% DoD to protect their shorter cycle life. Lead-acid belongs below 50% DoD, though most installers have abandoned the chemistry for good reason.
For commercial peak shaving, shallow cycling with a larger nameplate often wins. The battery cycles infrequently, so calendar aging dominates and stranded capacity is less of a penalty. Backup-only systems should use the shallowest DoD that meets autonomy requirements.
The final step is to model the specific project. Manufacturer recommendations protect warranties; LCOS optimization protects returns. SurgePV’s generation and financial tool runs degradation scenarios, lifetime yield calculations, and LCOS curves automatically — so you can set the DoD with confidence and defend the number in the client meeting.
Frequently Asked Questions
What is the optimal depth of discharge for a solar battery?
For lithium iron phosphate (LFP), 80–90% DoD typically minimizes lifetime cost per kWh in residential daily-cycling systems. For NMC, 60–80% DoD is safer. Lead-acid should stay below 50% DoD. The exact optimum depends on cycling frequency and project economics.
How does depth of discharge affect battery life?
Higher DoD stresses electrodes and electrolyte, reducing cycle life. LFP at 100% DoD delivers around 2,000 cycles; at 80% DoD, it reaches around 5,000 cycles. The relationship is non-linear — a 20% reduction in DoD can more than double cycle life.
What is the difference between DoD and SoC?
DoD measures discharged capacity as a percentage of total. SoC measures remaining capacity. They sum to 100%: DoD = 100% minus SoC. A battery at 30% SoC has undergone 70% DoD.
How do you calculate usable kWh from DoD?
Multiply nameplate capacity by the DoD percentage: Usable kWh = Nameplate kWh × DoD%. A 10 kWh battery at 80% DoD delivers 8 kWh usable. Always confirm whether the BMS reports usable or absolute capacity before sizing.
How does DoD affect levelized cost of storage?
Lower DoD extends cycle life and raises lifetime throughput, which lowers LCOS — up to a point. If DoD is too shallow, calendar aging wastes unused capacity and LCOS flattens or rises. The minimum LCOS usually sits between 70% and 90% DoD for LFP.
Should I charge my solar battery to 100%?
For LFP, charging to 100% of the usable BMS range is generally acceptable because the manufacturer already reserves a top buffer. Charging to 100% absolute cell SoC is not recommended for daily cycling. For NMC, holding the battery at 100% for long periods accelerates degradation; a 90–95% charge limit is often better for daily-cycling residential systems.
What DoD do solar batteries actually use in the field?
Residential LFP systems typically operate between 80% and 90% DoD. NMC all-in-one units are often set to 80% or lower by the manufacturer. Commercial peak-shaving systems frequently run at 30–50% DoD because the shallow draw is sufficient for tariff arbitrage and preserves cycle life.
How does DoD interact with battery degradation?
DoD drives cycle degradation — the mechanical and chemical wear from each charge and discharge. Calendar degradation operates in parallel, driven by time, temperature, and average state of charge. High DoD accelerates cycle fade. Very low DoD can increase calendar-aging cost per kWh by leaving capacity stranded. The optimal DoD is where the combined degradation cost is minimized.



