I was just having a conversation about chargers with a friend of mine the other day, and I googled around on the subject of the amperage/current (mA) ratings of chargers.
I stumbled upon these posts (either here on XDA, Androidforums, or other forums):
1. "The current rating on a voltage source is the maximum amount that the power
source can deliver without exceeding its saftey rating.
What this means is that if you are using some device that has a power supply
with a current rating of 500mA then its best not to use a different power
supply(at the same votlage rating) with a lower max current rating. i.e.,
anything < 500mA. Now ofcourse you might be able to get away with it but if
it burns down your house then its your fault.
A device will only pull the amount of current that it uses(assuming it is a
voltage controlled device) and this is true regardless of the current
rating(hence the saftey issues I discussed above). If a device
says(sometimes they don't) it uses 500mA then it uses 500mA. Maybe it
doesn't use 500mA all the time but the engineers have put that rating there
for a reason. Using any power supply with the right voltage and a current
rating of anything more than what the device uses is ok because the device
will only pull the current it uses.
Now, about the voltage rating: The voltage rating does not have to be exact
and different devices can tolerate different voltage ratings. The problem
usually is one of current. By increasing the voltage, say, you increase the
current the device uses and then you have changed the parameters that the
device was created with."
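
To make that first post concrete, here is a minimal sketch in Python of the rule it describes. The function name, the 0.25 V tolerance, and the example numbers are my own assumptions for illustration, not from any charger spec:

    def charger_is_safe(device_draw_ma, charger_volts, charger_max_ma,
                        device_volts=5.0, tolerance_v=0.25):
        """True if the charger's voltage matches the device's (within a
        small tolerance) and its current rating meets or exceeds the
        device's draw, per the quoted post."""
        volts_ok = abs(charger_volts - device_volts) <= tolerance_v
        amps_ok = charger_max_ma >= device_draw_ma
        return volts_ok and amps_ok

    # A phone that draws 500mA at 5V:
    print(charger_is_safe(500, 5.0, 700))    # True: 700mA >= 500mA
    print(charger_is_safe(500, 5.0, 350))    # False: supply is undersized
    print(charger_is_safe(500, 12.0, 1000))  # False: wrong voltage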
2. "And as far plugging your phone into a charger that outputs well over 850mA, don't worry about that either. Unlike voltage, the more amperage the merrier because the device will only take what it needs of the available resources."
3. "Moral of the story. Match the Voltage (5.1Volts) Meet or Exceed the 850mA rating. (which is .850 Amps) and you'll be fine."
4. "amps are not pushed but drawn
amps is the max the charger can provide
before it get pressured and lover the volts
you could use a 5volt 10000MegaAmp charger
and the device would only draw the amps the device
was made to draw all the rest of the amps would stay
at your electricity company
ohms law state Amps == volts / residence"
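
Cleaning up the Ohm's law line from that post: I = V / R, so the load's resistance, not the charger's rating, decides how much current actually flows. A quick Python illustration (the resistance values are made up):

    V = 5.0                        # volts, fixed by the charger
    for R in (50.0, 10.0, 5.0):    # hypothetical load resistances, in ohms
        I = V / R                  # current the load draws, in amps
        print(f"R = {R:4.0f} ohm -> I = {I * 1000:6.0f} mA")
    # Output: 100 mA, 500 mA, 1000 mA. A 1A-rated charger covers all
    # three loads; a 500mA-rated one would be overloaded by the 5-ohm
    # load, which is where the voltage starts to sag.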
5. "amps are not pushed but drawn
ohms law state Amps == volts / residence
In other English:
P = VI, where
P = Power of device (watts) and is fixed
V = Voltage used by device (volts) and is fixed
I = Current (amps) and is decided by P/V (a fixed ratio)
So the device cannot draw more current than the fixed ratio. It may draw less current if the charger cannot supply the highest amount, but then as in one of the above posts, it simply takes longer to recharge.
With these devices, milliWatts/miliAmps are the scale, 5V is generally the fixed Potential Difference.
Used in a vehicle, the device is generally both drawing and expending energy (ie. charging and running say, GPS) simultaneously. This in/out situation when prolonged is the cause of the observed overheating with the original X1 battery."
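
And the P = VI arithmetic plus the "it simply takes longer" point from that post, as a rough Python sketch (the battery capacity, the 850mA demand, and the assumption of a constant draw are invented for illustration; real charging tapers off near full):

    device_demand_ma = 850   # what the phone would like to draw
    battery_mah = 1500       # hypothetical battery capacity

    for charger_max_ma in (500, 850, 2000):
        actual_ma = min(device_demand_ma, charger_max_ma)  # drawn, not pushed
        watts = 5.0 * actual_ma / 1000.0                   # P = V * I
        hours = battery_mah / actual_ma                    # crude: ignores losses and taper
        print(f"{charger_max_ma}mA charger -> {actual_ma}mA drawn, "
              f"{watts:.2f}W, ~{hours:.1f}h to fill")
    # The 2000mA charger does not push 2000mA into the phone; the 500mA
    # one simply charges slower (and runs nearer its limit, which is the
    # caution in the first post).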
Bottom line... Make sure the voltage is 5V; the charger's current rating just needs to meet or exceed what the device draws, and anything above that doesn't matter, because the device only pulls what it needs.