The rollout of 5G presents network architects and product designers with many new challenges. One of them is ensuring the network meets stringent standards for time synchronisation, which is essential for delivering the high data transfer rates and ultra-low latency that are key features of 5G.
If individual nodes (such as 5G small-cell base stations) start to drift away from Coordinated Universal Time (UTC), quality of experience (QoE) and quality of service (QoS) can suffer, producing unwanted results like glitches in streaming video and delayed responses in connected vehicles.
5G networks will rely on GNSS signals for ultra-precise UTC synchronisation
Most networks obtain UTC from a global navigation satellite system (GNSS) by means of one or more primary reference time clocks (PRTCs), or grandmaster clocks. These clocks obtain the time signal via an antenna and GNSS module, hold it in an oscillator, and distribute it across the network using a protocol such as the Precision Time Protocol (PTP).
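To make the distribution step concrete, the sketch below shows the standard PTP exchange by which a downstream node estimates its offset from a grandmaster clock. This is a minimal illustration of the protocol's offset arithmetic, not any vendor's implementation; the timestamp values are invented for the example.

```python
# Illustrative sketch of PTP offset estimation: a downstream (slave)
# clock uses four event timestamps t1..t4 to estimate its offset from
# the grandmaster, assuming a symmetric network path. Times are in ns.

def ptp_offset_ns(t1, t2, t3, t4):
    """Offset of the slave clock relative to the master.

    t1: master sends Sync          (master clock)
    t2: slave receives Sync        (slave clock)
    t3: slave sends Delay_Req      (slave clock)
    t4: master receives Delay_Req  (master clock)
    """
    return ((t2 - t1) - (t4 - t3)) / 2

def ptp_mean_path_delay_ns(t1, t2, t3, t4):
    """Mean one-way path delay under the symmetric-path assumption."""
    return ((t2 - t1) + (t4 - t3)) / 2

# Invented example: slave clock runs 40 ns ahead, one-way delay 500 ns.
t1 = 0
t2 = t1 + 500 + 40      # Sync arrival, read on the fast slave clock
t3 = t2 + 1000          # Delay_Req sent, slave clock
t4 = t3 - 40 + 500      # Delay_Req arrival, read on the master clock

print(ptp_offset_ns(t1, t2, t3, t4))           # 40.0
print(ptp_mean_path_delay_ns(t1, t2, t3, t4))  # 500.0
```

The slave corrects its clock by the computed offset; the symmetric-path assumption is exactly what asymmetric effects (like multipath on the GNSS side) can violate.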
For 5G to deliver on its promise of high-quality, ultra-low latency data transfer, PRTCs and enhanced primary reference time clocks (ePRTCs) will need to keep the network synchronised ever more closely to UTC. The International Telecommunication Union (ITU) recommends a maximum deviation of just 30 nanoseconds (ns) from UTC.
Receivers are vulnerable to multipath effects
To meet the standard, the PRTC’s antenna must be able to receive signals from at least four satellites. In 5G networks, however, antennas must be placed near small-cell base stations—often at or near street level.
That creates problems in urban canyons, where there are rarely four line-of-sight signals available. Instead, signals tend to reflect, refract or diffract off surrounding tall buildings, as well as the ground. These multipath effects elongate the signal’s path to the antenna, potentially causing the receiver—which measures the distance the signal has travelled to calculate precise time—to output an inaccurate timestamp.
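The scale of the problem is easy to see with a back-of-the-envelope calculation: the receiver converts path length to time at the speed of light, so every extra metre of reflected path maps directly to timing error. The figures below are illustrative, not measurements from any specific receiver.

```python
# Illustrative calculation: how much timing error an elongated
# (multipath) signal path introduces, since receivers convert
# path length to time at the speed of light.

C = 299_792_458.0  # speed of light in vacuum, m/s

def multipath_timing_error_ns(excess_path_m):
    """Timing error in nanoseconds caused by extra path length in metres."""
    return excess_path_m / C * 1e9

# A reflection that adds just 9 m to the signal path already consumes
# roughly the entire 30 ns budget the ITU recommends for PRTC output.
print(round(multipath_timing_error_ns(9.0), 1))  # ~30.0
```

In other words, a single echo off a building a few metres from the direct path can by itself push a naive receiver outside the 5G timing budget, which is why multipath mitigation is central to receiver design.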
5G timing receiver developers must mitigate multipath effects
Emerging practice for 5G networks is to include a GNSS clock in every small-cell base station, making the new network an attractive market for developers of GNSS timing receivers.
But to be 5G-ready, the receivers must be able to synchronise time to the standard set by the ITU. For developers, that means ensuring the receiver can always mitigate multipath effects in order to calculate precise time.
Test challenges for 5G timing receivers
Timing receiver developers are hard at work on new multipath mitigation technologies for 5G, using techniques like dual antennas and advanced multipath mitigation algorithms. But testing the performance of their solutions presents challenges.
Every real-world antenna location has different characteristics, meaning that a receiver that works well in one place may not work well in another. Some antennas will have a clear view of the sky and only minimal multipath problems, while others will have a severely restricted sky view and high levels of multipath. Some will be located in a stable environment, while others will be placed in an environment that changes as traffic passes and new buildings are constructed around it.
To verify that the receiver can meet the ITU standard under all conditions, developers must test it in a wide range of locations. But that creates problems; in order to test the receiver’s output against UTC, the testers must also have a reference source of precise UTC to measure the results against. As that source is usually GPS or another satellite constellation, the reference source is also vulnerable to multipath in the test location.
Simply put: testing GNSS multipath mitigation technologies in real-world urban locations is almost impossible for timing receivers, because the reference source of UTC may also be affected by multipath. If the control is contaminated, the test becomes unreliable.
Realistic simulation of multipath effects with Spirent Sim3D
In the past, developers have got around this problem by simulating the effects of multipath in the lab, using GNSS signal simulators with built-in multipath effects turned on. The reference time source is usually a GNSS antenna placed on the roof with a clear view of the sky and a stable output.
The issue with this approach is that, typically, multipath effects generated by generic GNSS simulator software aren’t realistic. They’re generated using a statistical model, and can’t recreate a location’s unique characteristics, such as the height and distribution of the buildings, the materials they’re made of, or the presence of trees, pedestrians and traffic.
While these effects can provide useful insight into how the receiver handles multipath, they can’t provide fine-grained performance data to verify that the receiver will retain ultra-precise synchronisation in every location where it might reasonably be installed.
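To illustrate the limitation, here is a sketch of the kind of statistical multipath model a generic simulator might apply. The function name, distributions and parameters are all invented for illustration, not any vendor's actual model; the point is that each echo is drawn from fixed distributions with no knowledge of the real geometry, materials or traffic at a specific site.

```python
# Hypothetical statistical multipath generator, of the generic kind
# described above: echoes get random excess delays and attenuations
# drawn from fixed distributions, independent of any real location.

import random

def statistical_echoes(n_echoes, max_excess_delay_ns=300.0, seed=None):
    """Return (excess_delay_ns, relative_power_db) pairs for n echoes."""
    rng = random.Random(seed)
    echoes = []
    for _ in range(n_echoes):
        excess_delay_ns = rng.uniform(0.0, max_excess_delay_ns)
        # Power decays with delay -- a common statistical simplification.
        relative_power_db = -excess_delay_ns / 10.0
        echoes.append((excess_delay_ns, relative_power_db))
    return echoes

for delay, power in statistical_echoes(3, seed=42):
    print(f"echo: +{delay:.0f} ns at {power:.1f} dB")
```

Because the delays and powers come from distributions rather than from ray-traced geometry, two very different urban sites would produce statistically identical output, which is exactly why such models can't verify site-specific performance.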
To do that, testers need a much more realistic way to simulate multipath effects in different locations—a capability that Spirent offers with Sim3D. With Sim3D, multipath effects are modelled accurately and in rich detail in real time, so receiver performance can be evaluated realistically.
Furuno and NTT use Spirent Sim3D for realistic multipath simulation
One of the first developers to bring a 5G-compliant timing receiver to market is Japan’s Furuno Electric, which has worked with NTT to develop a multipath mitigation algorithm capable of remaining within a tight margin of error.
To test the algorithm in realistic multipath scenarios, Furuno and NTT used a Spirent GNSS signal simulator with Spirent Sim3D. With the Spirent solution exhaustively verified as realistically simulating multipath effects in a range of urban environments, the team could reliably assess the real-world performance of the algorithm—and were able to assure 5G compliance even in multipath-rich locations.
Read the full case study
You can learn more about how NTT and Furuno solved the problem of realistic multipath simulation in the lab by downloading the full case study.