Quote:
Originally Posted by DNSB
I seem to remember thinking that looking up decimal to binary in a book and entering the binary numbers into the computer then converting the computer's light display back to decimal was a bit silly. But then in 1952-3 when the book was written, it would be quite possible the computer would not have had enough memory to store the lookup tables. Grasping at straws here...
Maybe it's my background, but it's the idea of looking up the conversions in a book that strikes me as silly. I do most one-byte conversions in my head, two-byte values might take me a little longer, and if I can't do that for some reason (large values, or maybe not enough sleep), a pencil and a scrap of paper do just dandy.
I mean, seriously - take 10101001. Split it into four-digit pieces to make things easy: 1010 and 1001. Anyone who knows what binary is should be able to read those as "ten" and "nine." Getting that to decimal is simple: "ten times sixteen plus nine" equals 169. Going to hex is even easier: ten is A, nine is 9, write 'em next to each other, that's A9. (There's a reason we use four-bit chunks!) Works almost as easily in reverse.
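If anybody wants the nibble trick spelled out, here's a rough Python sketch of it (the function name is just something I made up for illustration):

Code:
def nibble_convert(bits):
    # Expect an 8-bit string like "10101001"
    high, low = bits[:4], bits[4:]          # "1010" and "1001"
    high_val = int(high, 2)                 # ten
    low_val = int(low, 2)                   # nine
    decimal = high_val * 16 + low_val       # ten times sixteen plus nine = 169
    hex_str = "%X%X" % (high_val, low_val)  # A9 -- one hex digit per nibble
    return decimal, hex_str

print(nibble_convert("10101001"))   # prints (169, 'A9')

Point being, the "lookup table" the book needed boils down to one line of grade-school arithmetic.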
Or, for a more timely comparison that RAH should have been able to figure out: if a WWII ship's radio operator can transcribe Morse code by ear because that's his job, his Space Patrol counterpart who works with binary-decimal-hex all day long will be just as good at translating it in his head. Even better, probably; Morse code relies on remembering a table, whereas this is grade-school multiplication.