iPad charger on Atrix?

primeboss

Senior Member
Jun 1, 2008
214
7
0
The Atrix's stock adapter is 5V = 0.85A, while the iPad's is 5V = 2.1A. Is it safe for the battery to use the iPad charger? I have also been using the iPod charger on my Atrix; should I continue to use it, or does that have negative effects too? The iPod charger is 5V = 1A.

Atrix: 5V = 0.85A
iPod/iPhone: 5V = 1A
iPad: 5V = 2.1A
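For context, each adapter's rated wattage is just volts times amps; a quick Python sketch using the numbers from the list above:

```python
# Rated power (watts) = volts x amps, for the adapters listed above.
chargers = {
    "Atrix": (5.0, 0.85),
    "iPod/iPhone": (5.0, 1.0),
    "iPad": (5.0, 2.1),
}

def watts(volts, amps):
    """Rated power of an adapter in watts."""
    return volts * amps

for name, (v, a) in chargers.items():
    print(f"{name}: {watts(v, a)} W")
```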
 

ian426

Senior Member
Jun 6, 2010
112
23
0
Typically, a device will only pull the current it needs. The ratings on power supplies are, to my knowledge, always maximum current ratings, not any form of 'forced' current. So the only time you need to worry is if the supply's rating is lower than your device's required input. You should be fine with either.
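That idea can be sketched in a few lines of Python; the numbers and the simple min() model here are illustrative assumptions, not measured behavior:

```python
def current_drawn(device_demand_a, supply_max_a):
    """Rough model of charging: the device decides the draw,
    capped by what the supply can deliver."""
    return min(device_demand_a, supply_max_a)

# An Atrix asking for ~0.85 A from different adapters:
print(current_drawn(0.85, 0.85))  # stock 0.85 A charger -> 0.85
print(current_drawn(0.85, 2.1))   # iPad 2.1 A charger  -> still 0.85
print(current_drawn(0.85, 0.5))   # weak 0.5 A PC port  -> limited to 0.5
```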
 

live4nyy

Senior Member
Mar 5, 2011
1,898
408
0
+1

That's correct. I actually spent a lot of time researching this kind of stuff because I use electronic cigarettes and finding chargers for them was difficult. Anyway, as long as it's 5V it should be fine. They make AC adapters that are iPad "compatible", meaning they are just rated at 2.1A, but they still work with the iPhone, which, as the OP stated, uses a lower amperage.

 

hlywine

Member
Feb 10, 2009
18
1
0
The fact that a device will pull only as much as it needs is true, but that applies only to devices, appliances, and anything that is using the electricity, not storing it, which is the case with a battery. Any electrical device uses only as much power as it needs. For example, a 55-watt household light bulb will only draw 0.5 amps (at 110 volts AC) even though the circuit is wired for 15 amps max.
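The bulb arithmetic follows from I = P / V; a one-liner to check it:

```python
def current_from_power(power_w, voltage_v):
    """Current drawn by a load: I = P / V."""
    return power_w / voltage_v

# A 55 W bulb on a 110 V AC circuit draws 0.5 A,
# far below the circuit's 15 A breaker limit.
print(current_from_power(55, 110))  # -> 0.5
```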
When it comes to cellphones, the phone is the device that uses the power, and the battery stores it. During charging, the battery will try to pull in as much as you give it, unless there is a limiting factor involved. That limiting factor can be the charger itself, which will supply at most 1.0 amps, 0.85 amps, or whatever the case may be. There may also be a limiting factor built into the phone's circuitry that would only let so much through (I seriously doubt it, though).
By plugging into a 2.1A charger, the battery will try to take in all 2.1 amps.
Pro: you charge the battery in half the time.
Con: if it doesn't destroy the battery right away, its lifespan and usefulness decrease dramatically.
This is called overcharging the battery; do some research on that and you will find that overcharging a battery is never a good thing.
2.1A is not enough to destroy the battery right away, but if you plugged in a 5 or 10 amp charger, it probably would be. I'm just saying this to explain the concept.
I personally use a 1.0A charger left over from a previous phone (Touch Pro 2), and your iPod charger should be OK too, but I wouldn't use anything bigger than that.
A small experiment you can conduct, which may or may not work: compare the temperatures of the battery/phone while it is charging on the 0.85A charger and on the 2.1A one. On the bigger charger it should get a lot hotter, and that is what destroys the battery.
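The heating that the experiment looks for would come from the cell's internal resistance: dissipated power scales with the square of the current (P = I²R). A sketch, assuming the battery really did draw the full rated current, and using a made-up internal resistance of 0.15 Ω:

```python
def heat_watts(current_a, internal_resistance_ohm=0.15):
    """Heat dissipated inside the cell: P = I^2 * R.
    The 0.15-ohm figure is an illustrative guess, not a spec."""
    return current_a ** 2 * internal_resistance_ohm

print(heat_watts(0.85))  # stock-charger current
print(heat_watts(2.1))   # iPad-charger current: ~6x the heat
```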
As far as my background goes, I have taken enough classes on electricity and electronics and have been working in the field for several years, so I hope I was helpful and explained it in simple enough terms for everyone.
 

ian426

Senior Member
Jun 6, 2010
112
23
0
I might have to double check that. There is a chance that there is some sort of limiting circuit between the wall and the battery in the Atrix... I'm fairly certain laptops, at least, do this. I will see if I have a stronger charger and check the voltage across the leads on the Atrix... if I can.
 

hlywine

Member
Feb 10, 2009
18
1
0
It's not the voltage you should be checking; the voltage should be the same in all USB chargers, about 5 volts. You should be checking amps.
 

ian426

Senior Member
Jun 6, 2010
112
23
0
My mistake... I realized that after I posted it.

Also, I do not have any USB charger rated over one amp, so I cannot check this. If anyone has a multimeter and a more powerful charger, they could do so.
 

live4nyy

Senior Member
Mar 5, 2011
1,898
408
0
The important factor is the voltage, which is 5V for both the iPad and Atrix chargers. Whether the adapter is rated at 10W or 5W does not matter, because that just reflects its current capacity. The charger being "rated" at 2.1A means it can handle that much current, not that it will force it. The draw is decided by the phone itself; as long as the voltage is identical, the other factors should not matter.

If you read the "Summary" here, it says that the iPad charger can charge an iPhone, which is similar to the Atrix in charging specs:
http://support.apple.com/kb/HT4327

And here are a couple more links:
http://www.youtube.com/watch?v=-ZjRm8nkv9Q

http://munnecke.com/blog/?p=836

 

hlywine

Member
Feb 10, 2009
18
1
0
Thanks live4nyy, I had never seen those before. With everything described there, the only possible conclusion is that each device has a built-in limiter on how much it will pull while charging, or else Apple figures that with a bigger charger your iPod/iPhone battery will still last past the 1-year manufacturer's warranty expiration date, but barely past it, instead of lasting the 3-5 years it is supposed to. Whatever the case is with Apple, I just hope we have a safety built into our Atrix phones. I guess the only way to find out is to actually check the amperage while it's charging.
 

live4nyy

Senior Member
Mar 5, 2011
1,898
408
0
I'm almost positive that the lithium batteries in phones these days are rated for a specific current and have built-in circuits that dictate the flow, which is also what causes the battery to drop into a "trickle" charge when near capacity. For that alone there has to be some sort of regulation happening. See also here:

http://science.howstuffworks.com/environmental/energy/question501.htm
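The trickle behavior described above can be sketched as a toy constant-current / constant-voltage (CC-CV) model; the 90% threshold and the linear taper here are made-up illustrations, not actual battery-controller values:

```python
def charge_current(soc, cc_limit_a, cv_threshold=0.9):
    """Toy CC-CV model: constant current until the cell nears full,
    then the current tapers off (the 'trickle' phase).
    soc is state of charge, from 0.0 (empty) to 1.0 (full)."""
    if soc < cv_threshold:
        return cc_limit_a  # constant-current phase: full rate
    # constant-voltage phase: current falls as the cell fills
    fraction_left = (1.0 - soc) / (1.0 - cv_threshold)
    return max(cc_limit_a * fraction_left, 0.05 * cc_limit_a)

# The phone's regulator picks cc_limit_a; the adapter rating only
# caps it, so a 2.1 A adapter does not force 2.1 A into the cell.
for soc in (0.2, 0.5, 0.9, 0.95, 0.99):
    print(f"{soc:.2f} -> {charge_current(soc, 0.85):.3f} A")
```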

But I agree, better safe than sorry. If you happen to have an iPad charger that you plan on using, let me know how it goes. I'm curious as well. :)

