With the advent of the S4 and its higher-capacity battery, a higher charging current makes sense. However, my findings suggest the charging system on the S4 is a bit more complex than first thought!
This is a bit techy and assumes a basic amount of electrical knowledge (i.e. voltages, currents, resistance, etc.), but I'll try to keep it as simple as possible for anyone who doesn't have it!
To explain: the mains charger supplied with the S4 is model number ETA-U90UWE, rated 5V @ 2A. However, the phone will ONLY charge at full current (which, from my measurements so far, typically sits in the region of 1.2 - 1.5A, depending on what the phone regulates it to) when using the supplied charger (or possibly one of equal or higher current rating, depending on how it's configured internally), AND the supplied USB cable, OR any other USB cable whose shielding (the metal outer surface of the connectors) is connected at BOTH ends. Use a cable without this shielding connection and the charge current drops, regardless of how much current the charger has available. Use a different charger with an unshielded cable and the current drops even further, again regardless of the charger's rating.
My assumption is that this is an effort by Samsung to avoid sticking 1.5 amps down a flimsy cheapo cable, whose wires would likely be too thin to carry it.
After doing some probing around with a meter, I have found a slight difference between the Samsung charger and a generic one. In a generic charger, the two data pins are usually just shorted together, which tells most phones it's a mains charger rather than a USB port. On the Samsung one, the pins appear to be shorted together AND connected via resistors across the supply line (known as a potential divider), which holds the shorted data pins at a certain voltage. This voltage is what tells the phone what sort of charger it's connected to.
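Just to illustrate the divider maths: the voltage on the shorted data pins is set by two resistors between the 5V line and ground, following the standard formula Vout = Vin x R2 / (R1 + R2). The resistor values in this little sketch are made up for the example (I haven't measured the actual ones inside the Samsung charger); the ~1.2V result is just a level commonly reported for Samsung-style charger identification.

Code:
    # Minimal sketch of the potential divider maths. Resistor values are
    # hypothetical, NOT measured from the actual Samsung charger.

    def divider_voltage(v_in, r_top, r_bottom):
        """Voltage at the junction of two resistors between v_in and ground."""
        return v_in * r_bottom / (r_top + r_bottom)

    V_SUPPLY = 5.0  # USB 5V supply line

    # e.g. 33k over 10k holds the shorted data pins at roughly 1.2V
    print(divider_voltage(V_SUPPLY, r_top=33_000, r_bottom=10_000))  # ~1.16 V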
Attached are a couple of diagrams showing the difference between the two chargers. There are in fact various resistor setups that different manufacturers use to signal the allowed charging current, so it's quite easy to run into compatibility issues!
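As a rough sketch of how the phone's detection logic might behave (the voltage window here is purely an illustrative guess on my part, not taken from Samsung's firmware):

Code:
    # Purely illustrative sketch of charger-type detection from the data
    # pin state. The threshold window is a hypothetical guess, not Samsung's.

    def classify_charger(data_pins_shorted, data_pin_voltage):
        if not data_pins_shorted:
            return "USB host port (enumerate, draw 0.5 A max)"
        if 1.0 <= data_pin_voltage <= 1.4:  # hypothetical window
            return "Samsung mains charger (allow full charge current)"
        return "Generic mains charger (allow a reduced current)"

    print(classify_charger(True, 1.2))   # Samsung-style divider on data pins
    print(classify_charger(True, 0.0))   # plain shorted data pins (generic)
    print(classify_charger(False, 0.0))  # normal USB data connection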
To make this a little less confusing, I have done some preliminary experimenting and set out my results below.
For the test, I used combinations of four different USB mains chargers: an Apple iPhone one rated at 1A, an Apple iPad one rated at 2.4A, an HTC 1A one, and the genuine S4 one. With these I used two cables: the supplied Samsung S4 one (which is shielded) and a cheap generic one (which isn't). I started by measuring the charging current directly with a meter, using a very short USB breakout lead I've made that lets me interrupt the 5V line. However, I soon noticed that any extension cable, even a shielded one, can lessen the chance of maintaining a good shielding connection, so I continued the exercise relying on the "Galaxy Charging Current" app for readings.
Charger       Cable         Current (A)
=======================================
HTC 1A        Generic       0.5
HTC 1A        Samsung S4    1.0
Apple 1A      Generic       0.5
Apple 1A      Samsung S4    1.0
Apple 2.4A    Generic       0.6
Apple 2.4A    Samsung S4    1.3
Samsung S4    Generic       0.8
Samsung S4    Samsung S4    1.3
So as you can see from these results, the original charger makes a difference, and the supplied cable (or a good-quality shielded one) makes a further difference. If you have any further findings, please feel free to add them here.
I can see this confusing some people, as it did me, since some will inevitably try charging their phone on generic chargers/leads at some point, with potentially long charging times as the result!
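For a rough feel of what those currents mean in practice, here's a back-of-envelope estimate assuming the S4's 2,600 mAh battery rating. This ignores the charge taper near full and conversion losses, so real times will be longer, but the relative difference is the point:

Code:
    # Rough, idealised charge-time estimate: capacity divided by current.
    # Ignores charge taper and losses, so real-world times are longer.

    BATTERY_MAH = 2600  # S4 battery rating

    for label, current_a in [("1A charger + generic cable", 0.5),
                             ("Samsung charger + Samsung cable", 1.3)]:
        hours = BATTERY_MAH / (current_a * 1000)
        print(f"{label}: ~{hours:.1f} h")  # ~5.2 h vs ~2.0 h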