Years ago, someone at Radio Shack (if you remember those) told me the opposite: that an adapter's amperage was more important than its voltage. For example, a device that runs on 8 D batteries, which should be 12V, can run just fine on 10V if the amperage is high enough. Maybe a lower voltage is fine, since batteries lose voltage over time, but a higher one is dangerous?
What is troubling is that the adapter has a list something like this:
5V : 3A
7.5V : 2A
15V : 1A
From a wattage standpoint that makes sense, since Power (Watts) = Voltage x Current.
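A quick sanity check of that wattage observation (the voltage/current pairs are taken from the adapter label above):

```python
# Verify that every setting on the adapter's label works out to the
# same wattage, using P = V * I.
ratings = [(5.0, 3.0), (7.5, 2.0), (15.0, 1.0)]  # (volts, amps) from the label

for volts, amps in ratings:
    watts = volts * amps
    print(f"{volts} V x {amps} A = {watts} W")
# Every pairing comes out to 15 W, so the label describes one 15 W
# power budget split across different voltage settings.
```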
But does that mean a reader drawing 1A would be getting 15V from it? That seems like an issue.