Quote:
Originally Posted by ProDigit
I've never owned or even seen a computer being off by more than 5 minutes in a year!
Most likely you've owned Internet-connected computers. Ever since Windows XP, Windows has automatically gone out to time servers on the Internet to set its clock, or to a domain server that does so on its behalf. Those Internet time servers in turn get their time from the atomic clocks run by the National Institute of Standards and Technology.

I, on the other hand, work with very large server computers that are not allowed to talk to the Internet (as in, $20K-and-up servers that either serve large amounts of data to isolated networks or provide virtual machines to isolated networks). I noticed quickly that everything was drifting out of sync with great rapidity, which hits Kerberos (a network authentication protocol) and other time-sensitive protocols very hard. The clocks on these servers were drifting by about 30 seconds per week, so I ended up setting up a time server on a bastion host that got its time from the Internet, and then having everybody else get their time from that bastion host.
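For the curious, here's a rough sketch of the kind of offset check involved, in Python using only the standard library. The server name is just a placeholder (on an isolated network you'd point it at your bastion host instead), and the math is the standard SNTP offset calculation from RFC 4330, not anything specific to my setup:

```python
import socket
import struct
import time

NTP_SERVER = "time.nist.gov"   # placeholder; on an isolated network, point this at the bastion host
NTP_PORT = 123
NTP_DELTA = 2208988800         # seconds between the NTP epoch (1900) and the Unix epoch (1970)

def ntp_offset(server=NTP_SERVER, timeout=5.0):
    """Send one SNTP request and return (offset_seconds, round_trip_seconds)."""
    # 48-byte request: LI=0, VN=3, Mode=3 (client) packed into the first byte, rest zero
    packet = b"\x1b" + 47 * b"\0"
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        t1 = time.time()                      # client transmit time
        sock.sendto(packet, (server, NTP_PORT))
        data, _ = sock.recvfrom(512)
        t4 = time.time()                      # client receive time

    def to_unix(hi, lo):
        # NTP timestamps are 64-bit fixed point: 32-bit seconds, 32-bit fraction
        return hi - NTP_DELTA + lo / 2**32

    fields = struct.unpack("!12I", data[:48])
    t2 = to_unix(fields[8], fields[9])        # server receive timestamp
    t3 = to_unix(fields[10], fields[11])      # server transmit timestamp

    # Standard SNTP offset/delay formulas (RFC 4330)
    offset = ((t2 - t1) + (t3 - t4)) / 2
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

if __name__ == "__main__":
    offset, delay = ntp_offset()
    print(f"clock offset: {offset:+.3f} s (round trip {delay:.3f} s)")
    # For scale: drifting 30 seconds per week is 30 / (7 * 86400) ~= 50 ppm,
    # or roughly 4.3 seconds of error per day if nothing ever corrects it.
```

Poll something like that against a decent reference a few times a day and drift on the order of 30 seconds per week becomes obvious very quickly.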
Point: Computer clocks suck because computer vendors figure you'll just get your time from the Internet.
(And as someone else pointed out, temperature is a big issue there too: my clocks run slower during the winter, when the average temperature of my server room drops to around 60°F from its summer average of about 75°F.)
So anyhow, I'm not upset about the Oasis clock being inaccurate. As long as it auto-syncs its time with Amazon's servers at the same time that it syncs everything else, I'm happy.