NIST maintains primary frequency standards for the United States, including the cesium fountain frequency standard NIST-F1, which operates with an uncertainty of less than 1 part in 10^15. The AT1 time scale is developed using an ensemble of cesium standards and hydrogen masers, operating as a real-time free-running scale with optimized clock weights.
UTC(NIST) is generated as an offset from the AT1 scale and steered toward UTC using data from the BIPM's Circular T. Steering adjustments are normally applied at 0000 UTC on the first day of each month, with frequency changes limited to ±2 nanoseconds per day. Because Circular T is published in arrears, the UTC(NIST) steering parameters are based on extrapolation from the most recent available UTC data.
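The steering procedure described above can be sketched as follows. This is a simplified illustration, not NIST's actual algorithm: the function name, the correction interval, and the idea of dividing the extrapolated offset over a fixed number of days are all assumptions; only the ±2 ns/day cap comes from the text.

```python
def monthly_steer(utc_minus_utcnist_ns, days_to_correct=30.0, limit_ns_per_day=2.0):
    """Sketch of a monthly steering correction (hypothetical helper).

    utc_minus_utcnist_ns : extrapolated UTC - UTC(NIST) offset in nanoseconds,
                           estimated from the latest Circular T data
    days_to_correct      : assumed interval over which to remove the offset
    limit_ns_per_day     : the stated +/-2 ns/day cap on frequency changes

    Returns the frequency adjustment (ns/day) to apply, clamped to the cap.
    """
    requested = utc_minus_utcnist_ns / days_to_correct
    return max(-limit_ns_per_day, min(limit_ns_per_day, requested))

# A 90 ns extrapolated offset would call for 3 ns/day, which the cap
# reduces to the maximum allowed 2 ns/day.
print(monthly_steer(90.0))
```

The clamp reflects the design goal stated in the text: UTC(NIST) converges on UTC gradually rather than absorbing the full offset at once.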
The relationship between UTC(NIST) and AT1 is quantified by three parameters: a leap second count (xls), a time offset (x), and a frequency offset (y). The difference between the two scales at any time T, expressed as a Modified Julian Day, is given by UTC(NIST) – AT1 = xls + x + y(T – T0), where T0 is the epoch of the parameter set.
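The relation above is a straight linear extrapolation, which makes it easy to evaluate in code. The function below is a sketch under assumed conventions: the parameter names and units (nanoseconds for xls and x, nanoseconds per day for y) are illustrative choices, not taken from the source.

```python
def utc_nist_minus_at1(T, T0, xls_ns, x_ns, y_ns_per_day):
    """Evaluate UTC(NIST) - AT1 = xls + x + y*(T - T0) in nanoseconds.

    T, T0        : dates as Modified Julian Days (T0 is the parameter epoch)
    xls_ns       : accumulated leap seconds, expressed here in nanoseconds
    x_ns         : time offset at epoch T0, nanoseconds (assumed unit)
    y_ns_per_day : frequency offset, nanoseconds per day (assumed unit)
    """
    return xls_ns + x_ns + y_ns_per_day * (T - T0)

# Ten days past the epoch, a 5 ns offset drifting at 0.2 ns/day
# (made-up values) accumulates to 5 + 0.2 * 10 = 7 ns.
print(utc_nist_minus_at1(60010.0, 60000.0, 0.0, 5.0, 0.2))
```

At T = T0 the frequency term vanishes and the difference reduces to xls + x, which is why each monthly parameter set carries its own epoch.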
Leap seconds are applied to UTC(NIST) and UTC but not to the free-running AT1 scale. Monthly time scale parameters and rate changes are presented in a table, with columns for the epoch T0, the xls, x, and y coefficients, and the start and end dates over which each parameter set is valid.
References are provided for further reading on topics such as GPS time transfer, calibration procedures, and accuracy evaluations of primary frequency standards.
Keywords: NIST, frequency standards, time scales