Quote:
Originally Posted by Rev. Bob
Maybe it's my background, but it's the idea of looking up the conversions in a book that strikes me as silly. I do most one-byte conversions in my head, two-byte values might take me a little longer, and if I can't do that for some reason (large values, or maybe not enough sleep), a pencil and a scrap of paper do just dandy.
I mean, seriously - take 10101001. Split it into four-digit pieces to make things easy: 1010 and 1001. Anyone who knows what binary is should be able to read those as "ten" and "nine." Getting that to decimal is simple: "ten times sixteen plus nine" equals 169. Going to hex is even easier: ten is A, nine is 9, write 'em next to each other, that's A9. (There's a reason we use four-bit chunks!) Works almost as easily in reverse.
Or, for a more timely comparison that RAH should have been able to figure out: if a WWII ship's radio operator can transcribe Morse code by ear because that's his job, his Space Patrol counterpart who works with binary-decimal-hex all day long will be just as good at translating that in his head. Even better, probably; Morse code relies on remembering a table, whereas this is grade-school multiplication.
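The nibble trick described in the quote can be sketched in a few lines of Python (the variable names here are just for illustration); it mirrors the "ten times sixteen plus nine" mental arithmetic exactly:

```python
# Split the example byte into two 4-bit nibbles and convert each one,
# mirroring the "ten times sixteen plus nine" mental math from the quote.
bits = "10101001"
hi, lo = int(bits[:4], 2), int(bits[4:], 2)   # 1010 -> 10, 1001 -> 9
decimal = hi * 16 + lo                        # 10 * 16 + 9 = 169
digits = "0123456789ABCDEF"
hex_str = digits[hi] + digits[lo]             # "A" next to "9" -> "A9"
print(decimal, hex_str)                       # 169 A9
```

Running it confirms the quoted arithmetic: 10101001 is 169 decimal and A9 hex.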
Early Univac computers used octal notation in their manuals: 3-bit chunks instead of 4.
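The same chunking idea works for octal; a quick Python sketch (again, names are illustrative) applied to the earlier example byte, padding to a multiple of 3 bits first:

```python
# Octal works the same way, but with 3-bit chunks (each reads as 0-7).
bits = "10101001"                                   # 8 bits -> pad to 9
padded = bits.zfill(9)                              # "010101001"
chunks = [padded[i:i + 3] for i in range(0, 9, 3)]  # ["010", "101", "001"]
octal = "".join(str(int(c, 2)) for c in chunks)     # 2, 5, 1 -> "251"
print(octal, oct(int(bits, 2)))                     # 251 0o251
```

So the byte that reads A9 in hex reads 251 in octal; Python's built-in oct() agrees with the hand conversion.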