11-24-2020, 05:19 PM   #4
DoctorM
Groupie
Posts: 168
Karma: 43000
Join Date: Mar 2010
Device: none
Years ago I had someone at Radio Shack (if you remember those) tell me the opposite: that an adapter's amperage was more important than its voltage. For example, a device that runs on 8 D-batteries, which should be 12V, can run just fine on 10V if the amperage is high enough. Maybe a lower voltage is fine since batteries lose voltage over time, but a higher one is dangerous?

What's troubling is that the adapter's label lists something like this:
5V : 3A
7.5V : 2A
15V : 1A

From a wattage standpoint that makes sense since Power (Watts) = Voltage x Current.
But does that mean a reader drawing 1A is getting 15V? That seems like an issue.
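For what it's worth, here's the quick arithmetic I did to check that (a trivial Python sketch; the voltage/current pairs are just the ones from the label above):

    # Each voltage setting times its rated current should give the same
    # wattage, since Power (watts) = Voltage (volts) x Current (amps).
    ratings = [(5.0, 3.0), (7.5, 2.0), (15.0, 1.0)]  # (volts, amps) from the label

    for volts, amps in ratings:
        print(f"{volts} V x {amps} A = {volts * amps} W")

    # All three settings work out to 15 W.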