It is claimed that two caesium clocks, if allowed to run for 100 years free from any disturbance, may differ by only about 0.02 s. What does this imply for the accuracy of the standard caesium clock in measuring a time interval of 1 s?

100 years $= 100 \times 365 \times 24 \times 60 \times 60 \text{ s} = 3.154 \times 10^{9}$ s.
Given: the difference between the two clocks after 100 years $= 0.02$ s.
Error in a time interval of 1 s $= \dfrac{0.02}{3.154 \times 10^{9}} \approx 6.34 \times 10^{-12}$ s.
So the accuracy in measuring a time interval of 1 s $= \dfrac{1}{6.34 \times 10^{-12}} \approx 1.58 \times 10^{11} \approx 10^{11}$.
Therefore, the standard caesium clock has an accuracy of about 1 part in $10^{11}$ to $10^{12}$.
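As a quick cross-check of the arithmetic above, here is a minimal sketch in Python. The solution itself contains no code, so the language and all variable names are assumptions chosen purely for illustration:

```python
# Cross-check of the caesium-clock accuracy estimate
# (a minimal sketch; names like SECONDS_PER_YEAR are illustrative).

SECONDS_PER_YEAR = 365 * 24 * 60 * 60   # ignoring leap years, as in the text

total_seconds = 100 * SECONDS_PER_YEAR  # 100 years expressed in seconds
drift = 0.02                            # claimed difference between the clocks, in s

error_per_second = drift / total_seconds  # error accumulated per 1 s interval
accuracy = 1 / error_per_second           # the "1 part in ..." figure

print(f"100 years       = {total_seconds:.4g} s")   # ~3.154e9 s
print(f"Error in 1 s    = {error_per_second:.3g} s")  # ~6.34e-12 s
print(f"Accuracy ~ 1 part in {accuracy:.3g}")         # ~1.58e11, i.e. ~1e11
```

Running this reproduces the values in the solution: an error of roughly $6.34 \times 10^{-12}$ s per second, i.e. an accuracy of about 1 part in $10^{11}$.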