The device doesn't request current. The device "looks at" the DC voltages on D+ and D- (the data pins) and decides how much it can take. The current rating marked on a device is the maximum it will ever take.
Apple devices, and gadgets that copy them, use their own fixed voltages on D+ and D- to signal that higher current is available.
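To make the "looks at the data pins" idea concrete, here's a rough sketch (Python, written for this post; the threshold numbers are illustrative, not exact spec values) of the kind of check a device's charging circuit performs:

[code]
def classify_charger(d_plus_v, d_minus_v):
    """Guess the supply type from the DC voltages seen on D+ and D-."""
    if d_plus_v >= 2.0 and d_minus_v >= 2.0:
        # Fixed resistor dividers holding both data pins high:
        # the Apple-style scheme, with different levels meaning ~1A, ~2.1A or ~2.4A.
        return "Apple-style charger (1A / 2.1A / 2.4A)"
    if 0.4 <= d_plus_v <= 0.8 and abs(d_plus_v - d_minus_v) < 0.2:
        # D+ effectively shorted to D-: a dedicated charging port,
        # so around 1.5A may be drawn.
        return "dedicated charger (up to ~1.5A)"
    # Anything else: assume a plain data port and stick to the USB default.
    return "standard host port (100mA, then 500mA once enumerated)"


print(classify_charger(2.7, 2.0))  # Apple-style divider
print(classify_charger(0.6, 0.6))  # data pins tied together
print(classify_charger(0.0, 0.0))  # floating pins / plain host
[/code]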
With 500mA devices, as long as the supply is 4.8V to 5.2V and can supply at least 500mA it's fine, though some combinations of charger and device may result in the device only taking 100mA (the low-power limit from the original USB 1.0 spec).
Some USB 2.0 devices assume at least 500mA is available, but even devices rated at 2A (some tablets) will only take more than 500mA (1A, 1.5A or 2A) if D- and D+ have compatible DC voltages or resistors on them.
Basically, a 500mA device should be fine and take up to 500mA from nearly anything, but 1A, 1.5A and 2A products need compatible hosts or chargers, or they will either not work at all (DVD drives, HDDs) or only take up to 500mA.
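Here's a minimal sketch of that rule of thumb, again with made-up helper names and the 500mA fallback hard-coded for illustration:

[code]
def allowed_draw_ma(device_rating_ma, charger_advertised_ma):
    """How much current the device should actually pull, in mA."""
    if device_rating_ma <= 500:
        # A 500mA (or lower) device just takes what it needs, almost anywhere.
        return device_rating_ma
    if charger_advertised_ma >= device_rating_ma:
        # The data pins advertise enough current: take the full rating.
        return device_rating_ma
    # No compatible signalling: fall back to the 500mA USB 2.0 default
    # (or, for things like DVD drives and HDDs, simply fail to run).
    return 500


print(allowed_draw_ma(500, 500))    # 500  -- works on nearly anything
print(allowed_draw_ma(2000, 2400))  # 2000 -- tablet on a compatible 2.4A charger
print(allowed_draw_ma(2000, 500))   # 500  -- same tablet on a plain host port
[/code]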
So a 1A, 1.5A or 2A supply is fine for a Kindle.
USB-C uses a separate data channel for power negotiation (the CC pins), rather than the fixed resistors or voltages found on most USB-A ports (hosts or chargers, with USB 1.x, 2.0 or 3.0 devices). With USB-C the initial default is still 5V, and the device then signals over that channel what voltage it wants, up to 20V. Obviously not every USB-C PSU or host port can do the full 20V at 5A (100W), or the chargers would all be large and expensive, and a laptop sourcing 100W per port would need a far bigger internal power supply than makes sense. Laptop host ports might be 15W maximum, shared across all the ports. USB-C can also carry DisplayPort, a video interface that competes with HDMI. USB-C tries to do everything in one connector; arguably it shouldn't be called USB at all.
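And a toy version of the USB-C Power Delivery idea: the supply starts at 5V, advertises the fixed voltage/current pairs it can manage, and the device picks one. The profile list and selection rule below are invented for illustration; real negotiation runs over the CC wire with its own message format:

[code]
def pick_profile(offered, device_max_v, device_needed_w):
    """Choose the highest-power (volts, amps) offer the device can accept."""
    usable = [(v, a) for v, a in offered if v <= device_max_v]
    best = max(usable, key=lambda p: p[0] * p[1])
    if best[0] * best[1] < device_needed_w:
        print("note: supply can't cover the full load, expect slow charging")
    return best


# A mid-range charger: nowhere near the 20V 5A = 100W maximum.
charger_offers = [(5, 3), (9, 3), (15, 3)]   # 15W, 27W, 45W
print(pick_profile(charger_offers, device_max_v=20, device_needed_w=45))
# -> (15, 3), i.e. 45W
[/code]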