Quote:
Originally Posted by ProDigit
No need for calibration. It knows any voltage below 2.8V is essentially cut off, and 4.2V is high.
It calibrates from reading the voltage and power drop over time.
When using the battery, the drop from 4.2 volt to 3.5V will take about the same time as from 3.5V to 2.8V, and that's all it needs to know.
First, I agree that at a given load, lithium cell voltage is directly related to state of charge. But voltage decreases as load increases and can rise again when current decreases, so state of charge is more complicated than a simple voltage measurement.
Second, I do not know whether the current state of battery management benefits from or uses any on board self-calibration.
But it is simply not true that a constant average load causes the battery voltage to decrease linearly with time.
Let's say a hypothetical "idle" but "awake" Kindle uses 140 milliwatts. That is not some official number, but it is close enough for illustration purposes. The Kindle power supply converts the battery voltage to the constant voltage that the Kindle needs to operate.
For the battery to supply 140 mW requires:
33.333 mA @ 4.2 V
40.000 mA @ 3.5 V
50.000 mA @ 2.8 V
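Those figures come straight from I = P / V for a constant-power load. A minimal sketch, using the post's illustrative (not official) 140 mW figure:

```python
# Current drawn from a Li-ion cell by a constant-power load
# at several battery voltages. 140 mW is the post's hypothetical
# "idle but awake" Kindle figure, used only for illustration.

P_LOAD_W = 0.140  # assumed idle power draw, watts

for v in (4.2, 3.5, 2.8):
    i_ma = P_LOAD_W / v * 1000.0  # I = P / V, converted to mA
    print(f"{i_ma:.3f} mA @ {v} V")
```

The current climbs as the cell voltage sags, which is why the discharge accelerates toward the end.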
so the current, and with it the rate of voltage drop, increases continuously over the discharge, and there is no way that "the drop from 4.2 volt to 3.5V will take about the same time as from 3.5V to 2.8V". For an actual battery, the above calculation is still too simplistic and optimistic, because as current increases, more power is lost to the internal resistance of the cell. For the same reason, the total mAh extractable from an alkaline cell is far higher at 1 mA than at 500 mA.
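The internal-resistance loss grows with the square of the current (P = I²R), so the penalty is disproportionately worse at low cell voltage. A rough sketch, with an assumed internal resistance (real values vary with chemistry, age, and temperature):

```python
# Power dissipated inside the cell at the three load currents
# computed earlier. R_INTERNAL = 0.15 ohm is an assumed value
# for illustration only, not a measured Kindle battery figure.

R_INTERNAL = 0.15  # ohms, assumed

for i_ma in (33.333, 40.0, 50.0):
    i_a = i_ma / 1000.0
    loss_mw = i_a ** 2 * R_INTERNAL * 1000.0  # P = I^2 * R, in mW
    print(f"{i_ma:.1f} mA -> {loss_mw:.3f} mW lost to internal resistance")
```

Even with this modest assumed resistance, the internal loss at 2.8 V is more than double the loss at 4.2 V, reinforcing that discharge is anything but linear in time.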