It has been reported that in 2020 every person on the planet created 1.7 megabytes of data every second. This continued growth in data generation, driven by applications that put video streaming, social media, online gaming and financial trading at the fingertips of users worldwide, is creating a data challenge for providers.
To deliver lower latency and improve the user experience, providers are decentralizing their datacenters, moving data and computation to the network edge, closer to the customer. However, this creates a problem of its own: while decentralization addresses the latency issue, it means data is now generated in multiple locations at high speed and can therefore become stale. This is a particular issue for applications delivering high-frequency financial trading or commercial transactions, which are subject to standards and regulations requiring precise, accurate timestamps on financial records.
So, to maintain data integrity, increasingly accurate time synchronization is required. Insufficiently accurate time synchronization can force a provider into bulk data transfers, considerably increasing the number of servers required and consequently driving up costs.
Time synchronization is not new in datacenters, but the level of accuracy providers seek is changing: hundreds or tens of microseconds, and sometimes even tighter, is now often required. To achieve this, providers are looking to the protocols used for time synchronization.
NTP (Network Time Protocol) has traditionally been used. Primarily designed for synchronization over the internet using unicast, it can usually achieve accuracy in the single-digit millisecond range. Implementations such as chrony, running on a local network under ideal conditions, can get within tens of microseconds.
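At its core, NTP estimates the client's clock offset from a timestamped round trip with the server, assuming the network path is symmetric. A minimal sketch of that calculation (the timestamp values below are purely illustrative, in microseconds):

```python
# Sketch of the offset/delay calculation at the heart of NTP (RFC 5905).
# Assumes a symmetric network path; timestamps are illustrative microseconds.

def ntp_offset_and_delay(t1, t2, t3, t4):
    """t1: client sends request, t2: server receives it,
    t3: server sends reply, t4: client receives it."""
    offset = ((t2 - t1) + (t3 - t4)) / 2   # server clock offset relative to client
    delay = (t4 - t1) - (t3 - t2)          # round-trip network delay
    return offset, delay

# Example exchange with made-up timestamps:
offset, delay = ntp_offset_and_delay(0, 15_000, 20_000, 40_000)
```

Asymmetry between the outbound and return paths shows up directly as offset error, which is one reason real-world conditions matter so much for accuracy.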
The newer PTP (Precision Time Protocol) is also now being used. Designed for local networks using broadcast/multicast transmission, it can, again under ideal conditions, synchronize the system clock to the reference time with sub-microsecond accuracy.
The obvious commonality between the two protocols, however, is that the conditions in which they operate affect the accuracy they can deliver. So as more datacenters move closer to users, into more remote locations and more wide-ranging conditions, synchronization testing in the field will be the only way to gain assurance of accuracy and to reap the benefits that decentralized data can deliver.