Not to mention, how much does the increased SEI change the cell's impedance (thus reducing subsequent charge speed) and the available capacity?
All Li-ion and Li-poly chemistries have shown the following:
1. Deep-cycle discharges below 60% state of charge cut the usable cycle count from 8,000 to under 2,000.
2. High-current discharge or rapid charging accelerates capacity loss by about 15% a year.
3. Internal resistance goes up as dendrite shorts damage the cell, and self-discharge rates increase as the cell degrades. (A rough fade model is sketched below.)
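For intuition on point 1: cycle-life-vs-depth-of-discharge data is often fit with an inverse power law. Here's a minimal Python sketch; the constants k and p are made-up illustrations, not measured values from any datasheet:

    # Toy cycle-life model: N(DoD) = k * DoD**-p (inverse power law).
    # k and p are illustrative assumptions, not fitted data.

    def cycle_life(dod: float, k: float = 2000.0, p: float = 1.5) -> float:
        """Estimated full cycles to end-of-life at a given depth of discharge (0-1)."""
        return k * dod ** -p

    for dod in (0.2, 0.4, 0.6, 0.8, 1.0):
        print(f"DoD {dod:.0%}: ~{cycle_life(dod):,.0f} cycles")

With these (invented) constants, shallow 20% discharges give ~22,000 cycles while full discharges give ~2,000, which is the qualitative shape behind the 8k-to-2k claim.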
Very surprising if the technique works for all cell chemistries. =3
This study solely focuses on the very first charge. It doesn't claim that recharging at high currents benefits battery life, only that the first charge at high current forms a larger protective barrier than a first charge at a low current.
Other studies have shown that a larger protective barrier improves lifespan. (See other comments on this thread for more details on the science.)
That is, if you do it a single time, are you down from 8k to 2k? Or does it decrease gradually, with 2k as the worst case?
Where can I read about this? Not a paper, but something more down-to-earth for consumers? That is, something that tells a consumer how to properly maintain various devices (phone/car) for longevity?
> 1. Deep-cycle discharges below 60% state of charge cut the usable cycle count from 8,000 to under 2,000.
Presumably these are NMC variants?
Major Chinese LFP brands come with 6K/10K-cycle guarantees (though under specific operational parameters). Are these cycle predictions unrealistic?
CATL/EVE/etc.
https://electrek.co/2023/08/29/tesla-battery-longevity-not-a...
> Removing more lithium ions up front is a bit like scooping water out of a full bucket before carrying it, Cui said. The extra headspace in the bucket decreases the amount of water splashing out along the way. In similar fashion, deactivating more lithium ions during SEI formation frees up headspace in the positive electrode and allows the electrode to cycle in a more efficient way, improving subsequent performance.
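To make the bucket analogy concrete, here's some back-of-the-envelope lithium-inventory accounting (all numbers are invented for illustration; the article doesn't quote these figures):

    # Normalized lithium inventory after assembly; loss fractions are assumptions.
    total_li = 1.00
    sei_loss_slow = 0.07   # fraction consumed by SEI during a slow formation charge (assumed)
    sei_loss_fast = 0.10   # fraction consumed during a fast, high-current formation (assumed)

    for label, loss in (("slow formation", sei_loss_slow), ("fast formation", sei_loss_fast)):
        usable = total_li - loss
        print(f"{label}: {usable:.0%} of lithium keeps cycling, "
              f"{loss:.0%} locked into the SEI ('headspace' in the cathode)")

The counterintuitive claim is that deliberately locking away the larger fraction up front leaves the positive electrode with more headroom and better cycling behavior afterwards.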
There are articles that appear here and elsewhere semi-frequently about how doing something simple extends battery lifetimes a huge amount, but those never get implemented in practice except perhaps for highly niche applications.
Instead, what usually happens is that they find a way to make them last the same amount of time but with higher energy density. The "high-voltage" Li-ion cells (>4.2 V end of charge) are an example of that process; they would last much longer than previous types if charged to only 4.2 V, but manufacturers would rather advertise them at 4.3, 4.35, or even 4.4 V(!) and the extra capacity that gives.
This is a lazy dismissal of any process or efficiency improvements.
If buyers are willing to pay for efficiency improvements, products that have them will be more attractive. If they aren't, they won't be.
If your theory were true, we wouldn't have things like rechargeable batteries, low-energy appliances, or light bulbs that would last more than two months.
There's always some performance point where most people largely stop differentiating products based on efficiency or longevity improvements, and I'm not sure consumer Li-ion batteries are at that point yet.
I'm not into Apple, but I'd guess that if Apple could have chosen between that "lowering performance on iPhones when the battery capacity decreased" shit and "preconditioning the cells to make them last longer", they would have chosen the second and made it very public.
This finding, however, specifically integrates with existing infrastructure; no new, unproven technology is needed, we simply juice the batteries harder during the initial charge. If it pans out after extensive testing, we could see this technique hit the market within 2 years.
That's only true under monopoly conditions.
Fortunately, in capitalism, when there are two or more companies making things like phones, those companies actually compete on features. And battery longevity is absolutely a feature consumers care about.
And there are certainly no monopoly conditions in cell phones. Competition is thriving, as it is in most types of portable electronics generally -- Bluetooth speakers, laptops, and so forth.
If you're the first company to do it, that means increased profits, because suddenly more people buy your product. And if you're the last, it means decreased profits, because fewer people will buy your product compared to the competition. That's the invisible hand at work.
If there's a capacity tradeoff, why not use a slightly modified chemistry (like LTO, for example)? Though I guess this article was more about the existence of the phenomenon than about exploiting it.
"risking it for the biscuit"
If the discovery slows down capacity degradation but makes your EV battery 100x more likely to spontaneously fail ($$$), it's not really an improvement. Maybe OK for a consumer device, though.
Modern Teslas show fairly similar long-tail degradation that is nearly identical for cars that strictly home-charge and those that only supercharge (based on customer vehicle tracking). Most level off at 85-90% of original capacity.
2021: low frequency pulsed charging: https://vbn.aau.dk/ws/portalfiles/portal/451327786/C5.pdf
2024: high frequency pulsed charging: https://onlinelibrary.wiley.com/doi/10.1002/aenm.202400190
Not up to really reading them right now, but this is a pretty neat area of research!
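For anyone curious what "pulsed charging" means mechanically, here's a minimal sketch of a pulsed current profile. The frequency, duty cycle, and peak current are arbitrary placeholders, not values taken from either paper:

    import numpy as np

    def pulse_profile(i_peak: float, freq_hz: float, duty: float,
                      duration_s: float, dt: float = 1e-3) -> np.ndarray:
        """Square-wave charging current: i_peak for 'duty' fraction of each period, else 0."""
        t = np.arange(0.0, duration_s, dt)
        phase = (t * freq_hz) % 1.0   # position within each pulse period
        return np.where(phase < duty, i_peak, 0.0)

    profile = pulse_profile(i_peak=2.0, freq_hz=100.0, duty=0.5, duration_s=0.1)
    print(f"mean current: {profile.mean():.2f} A (peak {profile.max():.1f} A)")

The idea in both papers, roughly, is that the rest intervals between pulses change how lithium deposits at the electrode compared with delivering the same average current continuously.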
Whether this is a "process tweak" or something "truly fundamental", a 50% increase in battery lifespan would be huge regardless.
The conspiracy theorist in me, though, thinks that a lot of consumer electronics makers wouldn't like this, because lower battery capacity has to be a big driver of upgrade cycles. I'm guessing a lot of folks are like me: these days, somewhere around the 2-3 year mark, my phone's battery capacity starts degrading noticeably. My phone otherwise works great, and I certainly don't need the features in the latest model, and of course I know I can pay for just a battery replacement, but sometimes I think, "Well, if I need to replace the battery, I might as well get a new phone - it's got <some feature that is marginally better but that I'm now convincing myself is super cool to justify my not-really-necessary upgrade purchase>".
I think with 50% more battery lifespan I would rarely, if ever, use dwindling battery capacity as an excuse for an upgrade purchase.
Re: consumer electronics, the obvious big example is cell phones, and notably major manufacturers have started adding features to extend battery lifespan by capping charge levels. Samsung has had this for a few years, originally capped at 85% but changed in a recent update down to 80%. I believe this occurred around the same time they extended their software support to 5 years. Apple added the same thing in iOS 17 with the iPhone 15 models, but despite obviously having the hardware capability ("optimized charging") they didn't enable the feature on earlier models.
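The logic behind those caps is simple. A hedged sketch of the kind of controller involved (read_soc and set_charging are hypothetical hooks for illustration, not any vendor's real API):

    CHARGE_CAP = 0.80   # e.g. Samsung's current default, per the comment above

    def charge_controller(read_soc, set_charging, hysteresis: float = 0.02) -> None:
        """Hold state of charge at the cap instead of letting it reach 100%."""
        soc = read_soc()
        if soc >= CHARGE_CAP:
            set_charging(False)                  # stop at the cap
        elif soc < CHARGE_CAP - hysteresis:
            set_charging(True)                   # resume a bit below the cap

The hysteresis band keeps the charger from rapidly toggling on and off right at the limit.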
The biggest thing is that 20% degradation in a cell phone means it can't survive a whole day, whereas that difference isn't as noticeable in a vehicle, where you're not running it to 0 anyway, just charging when needed.
What nice things could you do if you took, say, only 130% longevity and traded the rest for something else?
It's great that you use it as an excuse to get a new phone, but whatever small percentage of people are motivated to upgrade specifically because their battery doesn't hold enough charge is outweighed by the people who will buy one phone over another specifically because it's supposed to maintain its capacity for more years. Capitalism at work.
There's a specialized version of this called BC/RL for battery research as well.
This particular article sits about halfway along that scale. This was an actual study, with actual batteries, that reportedly did have improved lifespans. So dismissing it with a hand-wavy "this is all just academic nonsense" doesn't quite fly here. But from here to production is indeed quite a journey. I bet a lot of companies with active investments in battery R&D are paying attention and might try to replicate the success.
Also worth noting that if you only pay attention to the stuff that is at the highest levels, you basically miss out on new things until they are old news. For example if you have been dismissing solid state batteries, you might have missed the news that they are being used in products now.
Faster than what?
It turns out this is about the very first charge after assembly of the cell, not regular use.
However, I doubt this finding will see much use, except perhaps in applications like aerospace; it is in manufacturers' economic interest that their products have short lives.
Edit: looks like, as usual, comments that expose the truth get buried ;-)
> giving batteries this first charge at unusually high currents increased their average lifespan by 50% while decreasing the initial charging time from 10 hours to just 20 minutes.
That's insane
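For scale, converting those times to rough C-rates (assuming constant-current charging, which is a simplification of real formation protocols):

    # A full charge in T hours is roughly a 1/T C rate under constant current.
    for label, hours in (("slow formation", 10.0), ("fast formation", 20 / 60)):
        print(f"{label}: {hours:.2f} h -> ~{1 / hours:.1f}C")

That's a jump from roughly C/10 to roughly 3C for the very first charge.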
They don't exist in a vacuum, and batteries are a commodity market. Cheap ways to improve their product seem like an easy win to me.
Demand for batteries is virtually insatiable. The only constraint is the price, and a better quality product can command a higher price.
That's not true for passenger vehicles, particularly for high-spec products sold in the West. Integrating components such as battery cells, which have many critical performance parameters, is not trivial, and manufacturers are not free to substitute cells from commodity markets. EV manufacturers either make their own battery cells to their own proprietary standards, or they secure contracts with suppliers capable of making cells with consistent performance. Any change in cell characteristics, including supposed improvements such as the one in this report, must be integrated by the manufacturer, and supply must be assured.
It's not really a commodity market, despite appearances and hype.