Hey rchtk, nice data! How did you acquire it?
Hi Nathan. Nice data and plots too
I haven't been spending much time on Android lately, but you did such a great analysis that I have to answer your questions. I'll read it in detail later to adapt my analysis method to your ideas.
All the data exposed by the kernel is sampled using a hacky munin-node I coded from scratch (munin because... I know this stuff). That means you need a munin server to fetch from this node. So while my phone is in my house, all battery parameters are sampled every 5 minutes: the server fetches them and records them in RRD files, then the graphs are rendered in the browser from that RRD data. All of that runs automatically, so I have several weeks of data. Currently the x axis is always time. I also did a few combined graphs (instantaneous capacity vs displayed capacity, voltage vs open-circuit voltage, and open-circuit voltage vs SOC).
Code is there.
You'll need Termux, apt install perl, and a way to run Perl scripts as root (LD_LIBRARY_PATH tricks).
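For anyone who wants to replicate the sampling without munin, here is a minimal sketch of the same idea in Python. The sysfs directory and attribute names are the ones on my device (yours may differ), and the CSV output is my invention; the real node is the Perl script linked above.

```python
# Minimal battery sampler: snapshot sysfs attributes every 5 minutes
# and append them as CSV rows, mimicking what the munin node collects.
import csv
import time
from pathlib import Path

BMS = Path("/sys/class/power_supply/bms")       # fuel gauge (qpnp-fg)
BATT = Path("/sys/class/power_supply/battery")  # charger-facing device

# attribute name -> directory it lives in (device-specific)
ATTRS = {
    "capacity": BATT,
    "voltage_now": BATT,
    "current_now": BATT,
    "temp": BATT,
    "resistance": BMS,
}

def read_attr(base, name):
    """Return the integer value of a sysfs attribute, or None if absent."""
    try:
        return int((base / name).read_text().strip())
    except (OSError, ValueError):
        return None

def sample():
    """One timestamped snapshot of all attributes."""
    row = {"ts": int(time.time())}
    for name, base in ATTRS.items():
        row[name] = read_attr(base, name)
    return row

def run(interval=300, out="battery.csv"):
    """Append one row per interval; 300 s matches my 5-minute period."""
    with open(out, "a", newline="") as f:
        w = csv.DictWriter(f, fieldnames=["ts", *ATTRS])
        if f.tell() == 0:
            w.writeheader()
        while True:
            w.writerow(sample())
            f.flush()
            time.sleep(interval)

# run()  # start sampling (loops forever)
```

Reading the attributes as root is still required for the bms entries, same as with the Perl version.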
Especially the internal resistance measurement. Was that calculated?
From the kernel (/sys/class/power_supply/bms/resistance). I guess it's calculated from the sensed temperature (probably the typical hyperbola, or a couple of low-degree polynomials as a good and fast approximation). I haven't done a combined T/R graph, but one can see the relation from the two separate plots.
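To be clear, I haven't traced the exact formula in qpnp-fg; this is just the kind of piecewise low-degree polynomial approximation I mean, with entirely made-up coefficients (the real ones would live in the battery profile):

```python
# Illustrative only: mapping temperature (C) to internal resistance
# (milliohms) with one quadratic per temperature band. Coefficients
# are invented; they are chosen so the two bands meet at 25 C.

COLD = (0.08, -4.0, 210.0)  # a*T^2 + b*T + c, for T < 25 C
WARM = (0.0, -1.0, 185.0)   # for T >= 25 C

def resistance_mohm(temp_c):
    a, b, c = COLD if temp_c < 25 else WARM
    return a * temp_c**2 + b * temp_c + c
```

The point of the shape is just that resistance climbs steeply as the cell gets cold, which matches what the two separate T and R plots suggest.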
Code is a pain to read (the call graph is just crazy). The files involved are:
drivers/power/battery_current_limit.c
drivers/power/bcl_peripheral.c
drivers/power/msm_bcl.c
drivers/power/power_supply_core.c
drivers/power/power_supply_leds.c
drivers/power/power_supply_sysfs.c
drivers/power/qpnp-fg.c
drivers/power/qpnp-smbcharger.c
drivers/power/smb1351-charger.c
drivers/power/smb349-dual-charger.c
I thought about enabling more debug output in the kernel but didn't manage to, and got bored by the huge codebase. Not even talking about the vendor blobs...
Models usually clamp the max charge current at low (and sometimes high) temperatures. The first low threshold is 10 degrees on the 6P.
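The clamp boils down to something like the sketch below. The 10-degree / 1 A pair is what I observed on the 6P; the hot threshold and the other current values are placeholders, not from the code:

```python
# Sketch of a thermal charge-current clamp. Only the cold branch
# (below 10 C -> 1000 mA) reflects my observations; the hot threshold
# and the normal/hot limits are invented placeholders.

def max_charge_current_ma(temp_c, requested_ma):
    if temp_c < 10:
        limit = 1000   # cold clamp I saw on the 6P
    elif temp_c > 45:
        limit = 500    # placeholder hot clamp
    else:
        limit = 3000   # placeholder normal fast-charge limit
    return min(requested_ma, limit)
```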
Since I solved my calibration problem I haven't been much further.
The offset appears when SOC is wrongly calculated, leading to a wrong 0 or 100%.
My battery was losing 1% health in AccuBattery every day. That was just not physically possible (the stated EOL is 80% IIRC, so my phone would go from perfect to dead in 20 days lol).
I did a full discharge (that is, going down to 0% at a very low discharge C-rate) and a full charge (again at a very low C-rate, staying 1h30 at 100%). And I'm back at 100% health... It's actually funny to see the current still high while the display shows 100%. So the BMS works: it still fully charges, but the display shows 100% SOC while the cell is in fact at 92%.
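To illustrate the 100%-vs-92% situation: a coulomb-counting gauge reports charge divided by its learned full capacity, so if that estimate drifts low, the display saturates at 100% early while the charger keeps pushing current. The numbers below are illustrative, not from my logs:

```python
# Why a drifted learned capacity shows 100% too early. Both capacity
# values here are made up for illustration (8% drift).

REAL_FULL_MAH = 3450      # actual cell capacity
LEARNED_FULL_MAH = 3174   # gauge's drifted estimate, 8% low

def displayed_soc(charge_mah):
    """SOC the gauge would report, capped at 100%."""
    return min(100.0, 100.0 * charge_mah / LEARNED_FULL_MAH)

def real_soc(charge_mah):
    """SOC relative to the real cell capacity."""
    return 100.0 * charge_mah / REAL_FULL_MAH
```

With 3174 mAh in the cell, the display says 100% while the real SOC is 92%, which is exactly the mismatch a slow full cycle recalibrates away.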
Not exactly the "cold bug", but I wanted to share my discoveries:
* A threshold limiting the charge current to 1A is hardcoded at 10 degrees. Down to 0 degrees I haven't seen anything special, but I wasn't draining much (that is, 0 degrees was also the ambient temperature). A battery test straight from the freezer is still to do
* Calibration helped with my SOC offset bug
* From my analysis, the offset comes from high usage, where the model begins to get flawed.
* I have some suspicion about the current sensor: it seems very noisy (or does the BMS make it fluctuate a lot?). And that's THE input of the battery model...
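On that noisy current sensor: an exponential moving average is how I'd smooth the samples for plotting to separate trend from noise (whether the gauge filters its input similarly is purely an assumption on my part):

```python
# Exponential moving average over current samples (mA). alpha controls
# smoothing: smaller alpha = smoother trend, slower to react.

def ema(samples, alpha=0.2):
    out, acc = [], None
    for s in samples:
        acc = s if acc is None else alpha * s + (1 - alpha) * acc
        out.append(acc)
    return out
```

Overlaying ema() of the logged current on the raw samples would show quickly whether the fluctuations are sensor noise or real load changes.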
Since then, I can confirm that I need to redo this recalibration from time to time.
Testing needs a fixed protocol (always the same discharge and the same temperature), but well, I also want to use my phone.
Nice findings with the commits. I did see those lines in the code and thought the same.