Originally Posted by tech_head
Actually not weird for the tolerance.
The voltage probably has some tolerance. So P = IV.
That is, power is equal to voltage times current.
So 2.460x3.8v = 9.4Wh
If we back into the number then we will assume 3.8V is the minimum voltage from the battery based on the specs.
Then that means you could have a voltage as high as 3.95 from the battery based on 10Wh.
The bottom line is that to get to 10Wh either the voltage or current capacity has to go up.
2.46Ah x 3.8V = 9.348Wh. So I think you are rounding the wrong way.
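For anyone checking along at home, the product and its rounding can be sketched in a couple of lines (2.46 Ah and 3.8 V are the label figures discussed above):

```python
# Energy (Wh) = capacity (Ah) * nominal voltage (V)
capacity_ah = 2.46
voltage_v = 3.8

energy_wh = capacity_ah * voltage_v
print(f"{energy_wh:.3f} Wh")             # 9.348 Wh
print(f"rounds to {energy_wh:.1f} Wh")   # 9.3 Wh, not 9.4
```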
I considered that variable voltage might explain the discrepancy, and although the idea has some merit, I wasn't really convinced.
My EV30 battery has a voltage range of 3.51V -> 4.33V during discharge. This would give an average voltage of 3.92V, assuming a linear discharge curve. If the curve were convex (i.e., voltage drops more slowly at higher voltages), the average could indeed be 3.95V.
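A quick sketch of that midpoint calculation, using my battery's discharge range (and only valid under the linear-curve assumption noted above):

```python
# Discharge voltage range observed on my EV30 battery
v_min, v_max = 3.51, 4.33

# Midpoint average; this is the true average only if the
# discharge curve is linear between the two endpoints
v_avg = (v_min + v_max) / 2
print(f"{v_avg:.2f} V")  # 3.92 V
```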
The problems I have with the idea are:
1) Voltage is controlled by the onboard electronics. Others can correct me if this isn't the case, but every EV30 battery should have _exactly_ the same voltage range (set by the control electronics), even if the capacities vary (determined by the electrode materials). Therefore voltage cannot contribute to the delta between "min" and "typ" batteries.
2) Why would the "min" Wh be calculated from one arbitrary voltage level (the one displayed on the label) while the "typ" is calculated from some other arbitrary voltage level (3.95V) that isn't listed anywhere?
An interesting academic discussion, anyway; perhaps one day someone in the battery manufacturing industry can confirm exactly how the numbers on the label were derived.