Nexus 6p Cold bug

Ambient or battery temperature?
I got off the train (+20C ambient) and cycled back home. The phone sits in a plastic case attached to the handlebars while I ride. The ride took about 15 min and the weather report said it was 3-4C outside. The case felt cold to the touch but I had no app to check the temp. This is not the first time it has happened. At first I thought it was also linked to BT or the audio stream, since each time it was cold and moist and I was listening to music while cycling.
 

rchtk

Senior Member
Mar 7, 2012
900
286
Nice
I got off the train (+20C ambient) and cycled back home. The phone sits in a plastic case attached to the handlebars while I ride. The ride took about 15 min and the weather report said it was 3-4C outside. The case felt cold to the touch but I had no app to check the temp. This is not the first time it has happened. At first I thought it was also linked to BT or the audio stream, since each time it was cold and moist and I was listening to music while cycling.
Oh I see, that's probably what I was talking about above. Combined extreme temperature and (dis)charging will strongly affect the state-of-charge estimate. Once the phone was warm again, did you check what battery capacity your OS reported?

My battery shows 7 degrees in my fridge lol
Results:
Estimated resistance changes with the temperature drop (not sure about the physical unit)
It has no influence on the estimated SOC.
My battery health still shows Good (I expected maybe Cool?)
Charge current is limited from 3A to 1A at ~10 degrees.
Time to stream bluetooth from my freezer :)
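For anyone wanting to repeat the fridge experiment, here is a minimal sketch of reading the same values the kernel exposes. The sysfs base path and property names below are assumptions (they vary by kernel; later in the thread I read e.g. /sys/class/power_supply/bms/resistance):

```python
import pathlib

def read_battery_props(base="/sys/class/power_supply/battery",
                       names=("temp", "health", "capacity", "resistance")):
    """Read whatever battery properties exist under a sysfs directory.

    Note: the kernel reports 'temp' in tenths of a degree Celsius,
    so a reading of 70 means 7.0 degC (the fridge value above).
    Missing properties are simply skipped.
    """
    props = {}
    for name in names:
        path = pathlib.Path(base) / name
        if path.exists():
            props[name] = path.read_text().strip()
    return props
```

Run it as root on the device (e.g. from Termux); depending on the kernel, some properties only exist under the bms supply rather than battery.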
 

plemen

Senior Member
Happened to me a few times in the last two weeks. We've had a cold spell in Vancouver and the temperature has been below 0C (-8 to 0). I was taking pictures of the kids sledding outside and the phone shut down after 30 minutes or so... I had over 60% battery when this happened.

Very strange.
 

Faexer

Member
Dec 26, 2013
10
0
Got mine yesterday. We have about 5 degrees Celsius outside, and it always reboots. Inside, at about 22 degrees, no problems. When I unpacked it I updated it straight away, so I don't know whether it came with 7.1 or whether the problem was there all along. Is there a way to downgrade without rooting it?
 

Kappa Lamda

Senior Member
Jun 28, 2010
819
236
Patras
Google Pixel 6 Pro
Had the same issue last week in NY. Used the phone to take pics. Temperatures ranged from 5-6 Celsius during the day to 0 and -1 during the night. After 10 minutes of use in that cold the phone would always shut down and could not be turned back on for 10 minutes. After booting, instead of showing the percentage it had (for example 65%, 80% or 53%), it showed 5-8%. If I powered it off, put it in my pocket to warm it up and turned it on again, the battery was at ~25%.
Again, this should not be the percentage the battery actually has. And even after 30 minutes of usage it didn't drop. If it booted at 27%, it stayed at 27% after 30 minutes with the screen on, on Facebook and Messenger!

Is it possible to ask for a battery replacement?
 

csjneek

Senior Member
Oct 18, 2006
293
31
Craiova
I'm on the Marshmallow Chroma ROM and the same happened to me. At 27% battery I took a few clips and photos outside in the cold (~ -10C) and poof, down to 1%, then it shut off. Went inside, let it sit for 20 min or so, then turned it on and the battery resumed normally from 25%.
 

Nathan-K

Member
Nov 28, 2016
14
32
Bay Area, CA
plus.google.com
My battery shows 7 degrees in my fridge lol
Results:
Estimated resistance changes with the temperature drop (not sure about the physical unit)
It has no influence on the estimated SOC.
My battery health still shows Good (I expected maybe Cool?)
Charge current is limited from 3A to 1A at ~10 degrees.
Time to stream bluetooth from my freezer :)

Hey rchtk, nice data! How did you acquire it? Especially the internal resistance measurement. Was that calculated?

I ran a series of experiments on my Nexus 6P using ADB dumpsys batteryproperties and ADB dumpsys battery. I plotted my data here. My conclusion was that there is a calibration failure in the software, plus some hardware issues. I used Thevenin equivalent-circuit modeling and noticed that excessive internal resistance may be contributing to this issue. There are also some "meta" interactions (like Doze low-power states, the aluminum body design, lack of global real-world testing during the design stage, etc.). However, I am speculating, since I cannot confirm my suspicions by measuring them directly. I also noticed the sensor poll rate collapses when BatterySaver kicks in, potentially ruining the "emergency shutdown" logic you mentioned. (This would be a "software problem".)

(Part A) https://plus.google.com/102612254593917101378/posts/dpUd5Bm8Df5
(Part B) https://plus.google.com/u/0/102612254593917101378/posts/GCLcr4LqTPo

(Shutoff logged live) https://plus.google.com/u/0/102612254593917101378/posts/SjPzsvmQp4y
(Context) https://plus.google.com/u/0/102612254593917101378/posts/aTCSVkBC7ra

Please see this 4D surface graph I generated illustrating the 2 discharge runs:
("no line" dataset = no heatsinking -- mostly "peach" colored, since current was restricted by thermal throttling)
("line-connected" dataset = ice-pack heatsinking/thermal reservoir -- mostly "red" colored, since power was not throttled)

https://plot.ly/~Nathan-K/2/
  • Temperature (X axis)
  • Actual capacity (based off coulomb charge counter) (Y axis)
  • Cell voltage (Z axis)
  • Current draw (color intensity: more intense = higher draw)

ThreeCurves.png


Please note when loaded at a low temperature, the Coulomb counter -> "% level" reading is wildly inaccurate. It gets stuck at 10% for over an hour, despite 1A of load. So there is something noteworthy about that level. Can you identify what that is in the code?

Second, note how close the "heatsunk" dataset lies to the "3.30v" emergency shutoff value discovered in the "live capture" of a shutdown event. So this might be the "magic" cell voltage value you are seeking.

There are other observations buried in here -- including the max current draw allowed when thermal throttling kicks in, and at what temperatures -- but I'll let you come to your own conclusions. (The more eyes the better!) Please ask if you are curious about mine.

All data and methods are described in the post. I use a Wi-Fi ADB connection and some BASH scripts. You need no specialized tooling to run these tests. Perhaps you can replicate? I am not at liberty to run further experiments due to a conflict of interest from being a Google Top Contributor for Nexus/Pixel devices. I've actually actively avoided investigating this topic as a TC to try to keep my data "clean". So independent tests like yours are invaluable.
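To replicate without specialized tooling, the core of such a script is just parsing the key/value output of `adb shell dumpsys battery` in a loop. A rough sketch (the sample text mimics typical output; field names can differ across Android versions):

```python
def parse_dumpsys_battery(text):
    """Parse `adb shell dumpsys battery` output into a dict of strings.

    Units worth knowing: 'temperature' is in tenths of a degree Celsius
    and 'voltage' is in millivolts.
    """
    props = {}
    for line in text.splitlines():
        key, sep, value = line.partition(":")
        if sep and value.strip():
            props[key.strip()] = value.strip()
    return props

# Abridged sample of what a dump typically looks like:
sample = """Current Battery Service state:
  AC powered: false
  USB powered: true
  level: 87
  voltage: 4123
  temperature: 231
"""
```

Call it on the stdout of `adb shell dumpsys battery`, stamp each sample with the time, and you have a logger.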

[Citations]
(1) http://batteryuniversity.com/learn/article/rising_internal_resistance
(2) http://www.ibt-power.com/Battery_packs/Li_Ion/Lithium_ion_tech.html
(3) http://www.richtek.com/en/Design Support/Technical Document/AN023
(4) http://cdn.instructables.com/FVG/29GE/HY12L22N/FVG29GEHY12L22N.LARGE.jpg
(5) http://www.che.sc.edu/faculty/popov/drbnp/WebSite/Publications_PDFs/Web33.pdf

NON-HEATSINKED discharge run (no battery saver)
Drainage_HOT_fixed.png


ICE PACK HEATSINKED discharge run (battery saver kicked in at 15% and was turned off, but still damaged sensor data. Note how there are "two" [% Level vs Charge Counter] graphs.)
 

Nathan-K

Member
Nov 28, 2016
14
32
Bay Area, CA
plus.google.com
The battery model uses inputs from the current flowing in/out of the battery, the terminal voltage and the battery temperature sensors, and calculates the Open Circuit Voltage, leading to the estimated State Of Charge (via a third-degree polynomial)
(...)
I've gathered this information from the Qualcomm IC driver source

This file hasn't been updated in a long while, and that greatly concerns me: battery technology has definitely progressed since 2014, and this file does not reflect that. Please see the most recent version I linked below.

Also, judging by the fact that commits like this exist, I am greatly concerned about the robustness of their "polynomial" modeling. I get the feeling this battery controller was a "let's ship it and move on" job; they didn't put in the additional effort necessary to make their solution robust. (These are merely comments from the peanut gallery.)
https://android.googlesource.com/kernel/msm/+/7e797b6ac23417a526a7e663b49afe0d96cd74be
https://android.googlesource.com/ke...^2..460f0b6c67940ae25301767410293211435abc2f/

Translation: if you have to "fake" your data and "clamp" it to a range, your simulation function is wrong! The function should ideally only generate SOC values between 0 and 100 in the first place. If it outputs negative values, or values over 100, it is a poor descriptor of what you are trying to model -- especially at edge cases like the ones we are encountering here.

(Source)
https://android.googlesource.com/ke...er-3.10-nougat-mr1.1/drivers/power/qpnp-bms.c
https://www.codeaurora.org/about-us
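The clamping point above can be illustrated with a toy model. The coefficients below are invented for illustration (they are not the driver's tables): a third-degree OCV-to-SOC polynomial fitted only over the normal voltage range happily produces impossible values just outside it, and clamping merely hides that.

```python
def soc_raw(ocv_volts):
    """Toy third-degree polynomial mapping open-circuit voltage to SOC (%).

    Anchored so that 3.3 V -> 0% and 4.35 V -> ~100%; the coefficients
    are made up for illustration only.
    """
    x = ocv_volts - 3.3
    return 73.188 * x + 20.0 * x ** 3

def soc_clamped(ocv_volts):
    # What the driver effectively does: hide out-of-range model output.
    return max(0.0, min(100.0, soc_raw(ocv_volts)))
```

Evaluate it at 3.0 V and the raw model reports a negative SOC; at 4.5 V it reports over 100%. Those are exactly the outputs the "clamp" commits paper over instead of fixing the fit.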
 

rchtk

Senior Member
Mar 7, 2012
900
286
Nice
Hey rchtk, nice data! How did you acquire it?
Hi Nathan. Nice data and plots too :)
I haven't been spending much time on Android lately, but you did a great analysis, so I have to answer your questions. I'll read it in detail later and adapt my analysis method to your ideas.

All data given by the kernel are sampled using a hacky munin-node I coded from scratch (munin because... I know this stuff). That means you need a munin server to fetch this node. So while my phone is at home, all battery parameters are sampled every 5 minutes; the server fetches them and records them in RRD files, and graphs are then rendered in the browser from the RRD data. All that automatically; I have several weeks of data. Currently the x axis is always time. I also did a few combined graphs (instantaneous capacity vs displayed capacity, voltage vs open-circuit voltage, and open-circuit voltage vs SOC).
Code is there.
You'll need Termux, apt install perl, and the ability to run Perl scripts as root (LD_LIBRARY tricks).
Especially the internal resistance measurement. Was that calculated?
From the kernel (/sys/class/power_supply/bms/resistance); I guess it's calculated from the sensed temperature (probably the typical hyperbola, or two third-degree polynomials as a good and fast approximation). I haven't done a combined T/R graph, but one can see the relation from the two separate plots.
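As a rough illustration of that relation (constants entirely made up; the real driver presumably uses fitted tables or polynomials per temperature range):

```python
import math

def internal_resistance_ohm(temp_c, r_25=0.15, k=0.046):
    """Toy model: internal resistance grows roughly exponentially as the
    cell cools; with k=0.046 it about doubles per 15 degC drop.

    All constants here are illustrative, not measured values.
    """
    return r_25 * math.exp(k * (25.0 - temp_c))
```

Plotting this against the sampled /sys resistance values would be a quick way to check whether the kernel's estimate really is temperature-driven.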

Code is a pain to read (the call graph is just crazy). The files involved are:
drivers/power/battery_current_limit.c
drivers/power/bcl_peripheral.c
drivers/power/msm_bcl.c
drivers/power/power_supply_core.c
drivers/power/power_supply_leds.c
drivers/power/power_supply_sysfs.c
drivers/power/qpnp-fg.c
drivers/power/qpnp-smbcharger.c
drivers/power/smb1351-charger.c
drivers/power/smb349-dual-charger.c
I thought about enabling more debugging in the kernel but didn't manage it, and got bored by the huge codebase. Not even talking about vendor blobs...

Models usually will clamp the max charge current at low (and sometimes high) temperature. The first low threshold is 10 degrees for the 6P model.

Since I solved my calibration problem I haven't gone much further.
An offset appears where the SOC is wrongly calculated, leading to a wrong 0 or 100%.
My battery was losing 1% health in AccuBattery every day. That was just not physically possible (the stated EOL is 80% IIRC, so my phone would go from perfect to dead in 20 days lol).
I did a full discharge (down to 0% at a very low discharge C-rate) and a full charge (again at a very low C-rate, staying 1h30 at 100%). And I'm back at 100% health... It's actually funny to see the current still high while it shows 100%. So the BMS works; it still fully charges, but the display shows 100% SOC while it is in fact 92%.
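The health figure AccuBattery shows is essentially the charge counted over a full cycle divided by the design capacity, which is why one complete low-rate cycle recalibrates it. A sketch (3450 mAh is the 6P's nominal capacity; the current samples are invented):

```python
DESIGN_MAH = 3450.0  # Nexus 6P nominal battery capacity

def counted_capacity_mah(current_ma_samples, period_s):
    """Coulomb-count a full 0->100% charge: integrate current over time."""
    return sum(current_ma_samples) * period_s / 3600.0

def health_pct(counted_mah, design_mah=DESIGN_MAH):
    """Learned capacity as a percentage of design capacity."""
    return 100.0 * counted_mah / design_mah
```

A truncated or offset cycle integrates less charge than the cell actually holds, which is exactly the "100% displayed, 92% real" mismatch above.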

Not exactly the "cold bug", but I wanted to share my discoveries:
* A threshold limiting charge to 1A is hardcoded at 10 degrees. Down to 0 degrees I haven't seen anything special, but I wasn't draining much (that is, 0 degrees was also the ambient temperature). BT from the freezer is still to do :)
* Calibration helped with my SOC offset bug.
* From my analysis it comes from high usage, where the model begins to break down.
* I have some suspicion about the current sensor; it seems very noisy (or does the BMS make it fluctuate a lot?). And that's THE input of the battery model...
Since then I can confirm that I need to do this recalibration from time to time.
Testing needs a fixed protocol (always the same discharge and the same temperature), but well, I also want to use my phone :)

Nice findings with the commits. I did see these lines in the code and thought the same :)
 

murso74

Senior Member
Oct 30, 2010
1,881
356
I got a refurbished phone a few months ago because my battery was dying at 15%. Was outside today, I want to say it was mid-20s F, and my phone died at 24% battery. Plugged it in at home and it was at 1%. That's a pretty common temperature for NYC this time of year. Not sure if I should contact Google or not.

 

Top Liked Posts

(3 likes)
is there a particular reason you guys do this stuff?

So we don't have to.
(3 likes)
The battery model uses inputs from the current flowing in/out of the battery, the terminal voltage and the battery temperature sensors, and calculates the Open Circuit Voltage, leading to the estimated State Of Charge (via a third-degree polynomial).
Like any resistor, the internal battery resistance varies with temperature, and so does the OCV calculation.
All these parameters are linked non-linearly, and the models are defined within temperature ranges (different coefficients for different ranges). If the temperature exceeds the thresholds ( /sys/class/power_supply/bms shows a nominal range of [10 degC, 45 degC] ), the SoC will not be computed.
If the temperature exceeds the thresholds or the SoC varies too widely, the battery current will be limited, meaning it won't charge as expected, or the device could shut down for safety reasons.

If the battery temperature goes under 5 degC, the SoC is considered invalid.
As such, when the device is started again within the valid range, the SoC will be calculated correctly and the device will stay on.
There are also high/low safety thresholds which will instantaneously shut off the device (I think they are stored in the Qualcomm IC, so I can't see these values).

So while the battery can probably work fine under 5 degrees C, the SoC won't be calculated properly. You can also get some feedback from the reported battery health in /sys/class/power_supply/battery/health (Good, Dead, Warm, Cool, Cold, Overheat, ...).

I've gathered this information from the Qualcomm IC driver source
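To make that chain concrete (all numbers below are illustrative, not the driver's tables): the gauge only measures terminal voltage, so it must add back the I*R sag to recover the OCV before the polynomial lookup. If the resistance it assumes is the warm value while the cell is actually cold, the recovered OCV, and hence the SoC, collapses under load, which matches the shutdown pattern in this thread.

```python
def estimate_ocv(v_terminal, i_discharge_a, r_internal_ohm):
    """Thevenin model: terminal voltage sags by I*R under discharge,
    so OCV = V_terminal + I * R (discharge current taken as positive)."""
    return v_terminal + i_discharge_a * r_internal_ohm

# Hypothetical cold cell: true resistance 0.45 ohm, but the gauge still
# assumes the warm 0.15 ohm. Under a 2 A load the cell truly sits at a
# healthy OCV:
true_ocv = 3.9
v_terminal = true_ocv - 2.0 * 0.45              # what the ADC sees: 3.0 V
ocv_seen = estimate_ocv(v_terminal, 2.0, 0.15)  # gauge's guess: 3.3 V
```

A recovered OCV of 3.3 V reads as roughly empty, so the gauge reports a few percent and the phone shuts down even though the cell is fine; warm the phone back up and the "missing" charge reappears.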
(1 like)
Seems the Nexus 6P has a cold bug. First phone I've seen this in.

So I've been doing some overclocking and benchmarking on a few devices. The Nexus 6P has shown some strange behavior. When I had the phone chilled and ran some benchmarks, the battery would drop drastically. Restarting and all that did nothing. The battery stayed at 39% for about 2 hours with some heavy use. Charged it back up, no issues. Did it again and it did the same thing. Took my Turbo 2 to an even lower temperature and it was fine; it did not have this.

Something you have to worry about? Nope, not unless you are out in Alaska or taking a trip to the Arctic Circle. However, if you do, this could cause you some issues.

Tested on a stock ROM and kernel, as well as a PureNexus and ElementalX combo. Same outcome.

Again, not really an issue. And the Nexus camera visor and display survived the -40C freeze and back-to-room-temp test. Attached is a pic of the massive drop.
(1 like)
Why would we 'have to'?

Perhaps it gives an endurance threshold for those of us hiking in Alaska, or climbing an icy peak, etc.