The article discusses the NIST Time Scale Data Archive, which provides accurate time and frequency standards for various applications. NIST operates two primary frequency standards: NIST-F1, which has served as a U.S. primary standard of time and frequency since 1999, and NIST-F2, which began operation in 2014. The uncertainty of NIST-F2 is about 1 part in 10^16.
The article explains how NIST generates UTC(NIST) in real time from AT1, a time scale computed by an algorithm operating on data from an ensemble of commercial cesium standards and hydrogen masers. UTC(NIST) is generated as an offset from AT1, with changes in frequency limited to ±2.3 x 10^-14. The parameters that define UTC(NIST) with respect to AT1 are listed in a table, together with the equation used to compute the desired value.
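The exact parameters and equation are given in the table in the source article; as an illustration only, the relationship can be sketched as a linear offset of the assumed form UTC(NIST)(t) = AT1(t) + x0 + y·(t − t0), with hypothetical parameter names and illustrative values:

```python
# Sketch of steering UTC(NIST) from AT1, assuming a linear offset
#   UTC(NIST)(t) = AT1(t) + x0 + y * (t - t0)
# where x0 is the time offset (seconds) at epoch t0 (MJD) and y is the
# rate offset in seconds per day. Parameter names and values here are
# hypothetical; the authoritative values are in the source's table.

FREQ_CHANGE_LIMIT = 2.3e-14  # max frequency change per adjustment (s/s), per the article

def utc_nist(at1_seconds: float, t_mjd: float,
             x0: float, y: float, t0_mjd: float) -> float:
    """Return UTC(NIST) in seconds given AT1 and the steering parameters."""
    return at1_seconds + x0 + y * (t_mjd - t0_mjd)

def clamp_rate_change(y_old: float, y_new: float) -> float:
    """Limit a requested change in the rate offset y (s/day) to the
    +/-2.3e-14 s/s bound, converted to s/day (x 86400)."""
    limit = FREQ_CHANGE_LIMIT * 86400.0
    delta = max(-limit, min(limit, y_new - y_old))
    return y_old + delta
```

A usage sketch: with x0 = 1 ns at t0 and y = 2 ns/day, `utc_nist(at1, t0 + 10, 1e-9, 2e-9, t0)` adds 21 ns to the AT1 reading; `clamp_rate_change` illustrates the frequency-change limit quoted above.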
The accuracy of the time and frequency standards is evaluated in various publications, including those by Allan, Davis, Weiss, Jefferts, Shirley, Lewandowski, and Weiss. The article concludes by summarizing the key points and briefly reviewing the accuracy evaluation of the earlier primary frequency standard NIST-7.
Source: https://www.nist.gov/pml/time-and-frequency-division/time-services/nist-time-scale-data-archive
Keywords: frequency, cesium, accuracy