John Cowan wrote on 2002-11-28 19:54 UTC:
> *sigh*
>
> Almost all embedded clocks, including the ones on our walls and wrists,
> display TAI-style time (60 seconds per minute) but are set using LCT,
> which is defined in terms of UTC. This fundamental disconnect is not
> going to go away unless UTC is redefined (or unless the LCT-UTC link is broken).
*sigh*
Let's get a little reality check:
Most embedded clocks (state of the art is still a calibrated 32
kibihertz crystal) have a frequency error of at least 10^-5 (10 ppm),
and therefore drift away from the TAI rate faster than 1 second per
week. These clocks are useless for discovering whether a leap second
has occurred even between two weekly adjustments to LCT! There seems
to be no technology around the corner for less than a few thousand dollars per
oscillator that can actually distinguish between TAI and UTC based on
even a monthly comparison. Even the drift of good
temperature-compensated crystal oscillators is orders of magnitude worse
than the rate at which TAI and UTC drift apart. [If you know of any affordable
portable oscillator technology with a frequency accuracy better than
the TAI-UTC drift rate of ~2*10^-8, I am sure the designers of handheld
GPS locators would be extremely eager to hear from you, and so would I!]
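To put numbers on this, here is a back-of-the-envelope comparison in C
(only the 10 ppm and ~2*10^-8 figures quoted above go in):

    #include <stdio.h>

    int main(void)
    {
        double freq_error   = 1e-5;         /* 10 ppm calibrated crystal  */
        double week         = 7 * 86400.0;  /* seconds per week           */
        double tai_utc_rate = 2e-8;         /* approx. TAI-UTC drift rate */

        printf("crystal drift per week:  %.1f s\n", freq_error * week);
        printf("TAI-UTC change per week: %.3f s\n", tai_utc_rate * week);
        /* prints ~6.0 s versus ~0.012 s: the crystal's drift is some
         * 500 times larger than the signal it would have to resolve */
        return 0;
    }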
When you say "TAI-style", you are really speaking of some idealized clock with
properties close to what you find on the market only in very expensive
(>>10^4 US$) clocks from the Agilent catalog, not of anything that you
might put onto a wall, wrist, cockpit, telecoms switch or workstation
motherboard! Only a few thousand labs and astronomical stations actually
have independent clocks of the accuracy that you postulate, and these
lab clocks need to be fed, watered and cleaned with tender loving care.
Every other clock that is not remotely controlled by a UTC broadcast
service gets adjusted manually anyway every few megaseconds
(or more) by a few tens of seconds (or more).
So what? We are really talking here *only* about the needs of users of
synchronized clocks that are controlled in phase and/or frequency
automatically relative to a reference broadcast time signal. With a bit
of proper engineering, the current TAI-UTC difference and any
planned changes to it could easily be communicated along the same
communication channels that are used by the control loop that keeps the
clock from racing away from UTC by more than a second per week.
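To illustrate the "bit of proper engineering": GPS already broadcasts the
current and upcoming leap-second count in its navigation message, so the
precedent exists. A hypothetical record appended to such a broadcast
(field names invented here purely for illustration) could be as small as:

    #include <stdint.h>

    /* Hypothetical TAI-UTC notice carried by a time broadcast; the
     * field names are made up for this sketch, but GPS transmits
     * equivalent data in its navigation message. */
    struct tai_utc_notice {
        int16_t  tai_utc_now;   /* current TAI-UTC in seconds (32 s in 2002) */
        int16_t  tai_utc_next;  /* TAI-UTC after the next scheduled change   */
        uint32_t change_mjd;    /* MJD of the UTC day ending with the change */
    };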
Very few of these applications (e.g., space navigation) really need to
make exact long-term time interval measurements, and those few that do
can subtract TAI timestamps. Most applications need just phase
synchronization of some other oscillator (e.g., some synchronous digital
communications systems), or need to agree worldwide on unique timestamps
for events (some real-time trading systems, logging in time-critical
distributed transaction and filing systems, distributed debugging,
etc.). All of these can already get along quite well with UTC, except for the
issue of needing a special 60.xxx notation for inserted leap seconds.
UTS fixes that elegantly. Another large set of applications needs to
make time-interval measurements, but can easily handle an error of 0.1%,
especially if this frequency error happens everywhere simultaneously.
There is existing practice for such timescales (e.g., the BSD Unix
adjtime() mechanism), but what is lacking is a formal standard to
sanction one as preferred practice. These applications are what UTS is
for, too.
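For concreteness, the adjtime() mechanism mentioned above already
provides the needed building block: the clock is slewed rather than
stepped. A minimal sketch (the slew rate is implementation-defined, so
this is merely the substrate on which a UTS-like standard could sit,
not UTS itself):

    #include <sys/time.h>

    /* Absorb an inserted leap second by slewing the system clock back
     * one second instead of stepping it.  Requires the usual clock
     * adjustment privileges; returns 0 on success. */
    int absorb_inserted_leap(void)
    {
        struct timeval delta = { -1, 0 };  /* remove one second, gradually */
        return adjtime(&delta, NULL);
    }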
I see absolutely no practical engineering need to disconnect civilian
time forever from the rotation of the earth. So why then should we break
with several thousand years of established civilian timekeeping practice
on this planet? Just to allow your local caesium fountain sales
representative to claim that their product won't need any adjustments
for a few hundred millennia to show LCT? DST means they'd be wrong anyway.
Just to save a sloppy manufacturer of a GLONASS receiver from fixing a
leap-second-related bug in their firmware? I'm not convinced ...
What ought to be done:
- Keep UTC essentially as it is.
- Define something like UTS as an international standard way
to navigate around the out-of-range notation needed for timestamps
during leap seconds, for use in applications (e.g. computer
operating systems) where real UTC leap seconds cause unnecessary
complications but a 0.1% frequency error for 1000 seconds before
a leap second is perfectly tolerable (a sketch of such a mapping
follows below this list).
- Make UTC-TAI (current and planned changes) an integral part of all
UTC broadcast services, for use in the few applications that need
to measure time intervals more accurately than the typical 10^-5
frequency accuracy of normal clocks (space navigation, radio
astronomy, some geophysical observation techniques, etc.).
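To make the second item concrete, here is one way such a smoothing could
map UTC to UTS on a day with an inserted leap second. This is a sketch
only; the exact smoothing interval (here: the final 1001 SI seconds of
the day, during which the UTS clock runs at rate 1000/1001, i.e. 0.1%
slow) is precisely what the standard would have to pin down:

    /* Map UTC seconds of day (0 <= utc < 86401, where [86400, 86401)
     * is the inserted leap second 23:59:60.xxx) to UTS seconds of day.
     * The result always stays below 86400, so a second numbered 60
     * never appears on a UTS display. */
    double uts_of_day(double utc)
    {
        const double start = 86401.0 - 1001.0;   /* 23:43:20 */
        if (utc < start)
            return utc;              /* outside the interval: UTS = UTC */
        return start + (utc - start) * 1000.0 / 1001.0;
    }

For example, while UTC reads 23:59:60.5, uts_of_day(86400.5) returns
about 86399.5005, i.e. a UTS display of 23:59:59.5005, and both scales
reach midnight together.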
What engineering problem is not solved satisfactorily by these measures?
Markus
--
Markus G. Kuhn, Computer Laboratory, University of Cambridge, UK
Email: mkuhn at acm.org, WWW: <http://www.cl.cam.ac.uk/~mgk25/>
Received on Thu Nov 28 2002 - 14:33:16 PST