I've tested with D+/D- shorted and it's not working.
@MizGarfield if you have a USB extension cable you can cut it in half.
Tie the white and green wires (D- and D+) together.
Tie the black (ground) wires together and connect them to one end of a 10 kΩ resistor; tie the other end of the resistor to the joined green/white wires.
Do the same with the red (+5 V) wires, but use a 33 kΩ resistor.
See attached drawing, sorry it's rough.
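For what it's worth, the two resistors above form a simple voltage divider on the tied-together data lines. A quick sketch of the voltage they'd produce (assuming black = GND, red = +5 V, which the post doesn't state explicitly):

```python
# Rough sketch of the voltage the wiring above puts on D+/D-.
# Assumptions (not stated in the post): black = GND, red = +5 V,
# 10 kOhm from the tied data lines to GND, 33 kOhm to +5 V.
VBUS = 5.0        # volts on the red wire
R_TOP = 33_000.0  # ohms, data lines to +5 V
R_BOT = 10_000.0  # ohms, data lines to GND

v_data = VBUS * R_BOT / (R_TOP + R_BOT)
print(f"D+/D- sit at about {v_data:.2f} V")  # about 1.16 V
```

Whether ~1.16 V is what a given phone actually looks for depends on the device; different chargers put different fixed voltages on D+ and D-.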
You don't need that for modern devices, as I told you. It didn't work because you did it wrong. All of my USB chargers work the new way, and even Wikipedia describes it. You have to cut D+ and D- between the power supply and the female USB port, then short only the female port's D+ and D- together. Trust me, it works and it's a lot simpler.
Edit: from Wikipedia: "The Dedicated Charging Port shorts the D+ and D- pins with a resistance of at most 200 Ω. The short disables data transfer, but allows devices to detect the Dedicated Charging Port and allows very simple, high current chargers to be manufactured. The increased current (faster, 9 W charging) will occur once both the host/hub and devices support the new charging specification."
"As of June 14, 2007, all new mobile phones applying for a license in China are required to use the USB port as a power port.[35][36] This was the first standard to use the convention of shorting D+ and D-.[37]"
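The detection behind the short can be sketched as a toy model (this is a simplified take on USB Battery Charging 1.2 "primary detection", not real firmware; the threshold values are from the spec, the function is hypothetical):

```python
# Simplified model of BC1.2 primary detection: the device drives
# ~0.6 V onto D+ and watches D-. On a Dedicated Charging Port the
# lines are shorted (<= 200 ohms), so the voltage echoes back on D-;
# on a normal data port, D- stays near ground.
VDP_SRC = 0.6     # volts the device drives onto D+
VDAT_REF = 0.325  # D- comparison threshold (spec range 0.25-0.4 V)

def is_charging_port(d_minus_volts: float) -> bool:
    """True if D- echoes the voltage driven onto D+."""
    return d_minus_volts > VDAT_REF

print(is_charging_port(0.58))  # shorted D+/D-: True
print(is_charging_port(0.05))  # ordinary data port: False
```

This is why a plain short works with no resistors: the device only checks whether D- follows D+, not what exact voltage sits on the lines.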
http://apple.slashdot.org/story/10/08/03/1743240/Hardware-Hackers-Reveal-Apples-Charger-Secrets
"We all love to call out Apple when they design deliberate incompatibility into their devices, but there is a perfectly valid technical reason for what Apple is doing here, and, in fact, they are following a USB specification (which LadyAda unfortunately didn't even test).
Without data communications or when suspended, devices may legally draw no more than 2.5mA from a host, which is useless for charging. In fact, even if you're generous and pretend they're connected, devices are not allowed to draw more than 100mA without negotiating for a higher current, which requires actually talking to the host, and 100mA is still too little to charge properly. 500mA is the maximum allowed by the USB spec, but devices must negotiate it (there may be too many devices on the bus for negotiation to succeed).
Before there was a spec for "dumb" USB chargers, Apple used the resistors as a sentinel to avoid drawing too much current from undersized chargers in order to avoid damaging the host. This is a hack, but it works, and honestly, we're smart enough to figure out a couple resistors on the data lines. It's not like they're using crypto auth on the charger. They have a perfectly valid reason to do this. Devices which charge from "dumb" chargers aren't following the spec, though this is a common industry practice.
As it turns out, the USB-IF came up with a USB Battery Charging spec [usb.org]. The spec is long and boring, but it boils down to: short together the data lines (no resistors required) and you indicate that you're a dumb charger that can supply anywhere from 0.5A to 1.5A.
Guess what happens when you short the data lines of an iPhone 3G and supply 5V [marcansoft.com]. Did Apple just follow a standard? Incredible!
(Yes, I'm not following the USB spec there by in turn using a USB cable to supply the 5V and not negotiating over its data lines. I didn't feel like grabbing a dedicated 5V PSU for the shot, so sue me.)"
http://marcansoft.com/transf/iphonechg.jpg
ok ???
No need for resistors: just 5 V (5.3 V max) with D+ and D- shorted.
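To put the current limits from the quoted Slashdot comment in perspective, here's a back-of-the-envelope charge-time comparison (the 1400 mAh battery capacity is an illustrative guess, and charging losses and taper are ignored, so real times would be longer):

```python
# Why 100 mA "is still too little to charge properly": naive
# charge times for a hypothetical 1400 mAh phone battery.
CAPACITY_MAH = 1400  # illustrative, not from the thread

for label, ma in [("suspended (2.5 mA)", 2.5),
                  ("unconfigured (100 mA)", 100),
                  ("negotiated (500 mA)", 500),
                  ("dumb charger (1.5 A)", 1500)]:
    print(f"{label}: ~{CAPACITY_MAH / ma:.1f} h")
```

At 100 mA that's roughly 14 hours for a full charge, which is why devices want the higher currents a shorted-data-line charger advertises.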