Without a human at the wheel, how does a self-driving car know where it is – and what to do?
In highly automated and autonomous vehicles, the job of reading the road and making tactical driving decisions falls to the Advanced Driver-Assistance System (ADAS) or in-vehicle control system: an advanced software application that takes inputs from a variety of sensors, synthesises them into an understanding of the vehicle’s environment, and acts on that understanding in real time.
Vital to this is the vehicle’s understanding of its exact position. To achieve that, a connected autonomous vehicle relies on data from a battery of sensors, which will include some or all of the following:
GPS/GNSS: To determine the vehicle’s absolute position on the Earth’s surface, to establish the exact timing of actions and communications, and to aid navigation decisions.
Lidar, radar, sonar, computer vision: To detect and characterise objects in the environment and to read signs, informing decisions about braking, lane-keeping, lane-changing and collision avoidance.
Inertial measurement units (accelerometers, gyroscopes): To measure speed, movement and orientation, and to help determine current position using dead reckoning calculations from previous known positions.
WiFi and cellular (5G, LTE): To communicate with infrastructure and other connected vehicles, and to help determine position, particularly in places where GNSS is unavailable – like tunnels and underground car parks.
RTK/PPP: To increase the accuracy of a GPS/GNSS-derived position to centimetre level, for precision positioning applications like lane keeping.
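To make the dead reckoning and fusion ideas above concrete, here is a minimal, hypothetical sketch (not Spirent's or any OEM's actual algorithm): an IMU-derived speed and heading propagate the last known position, and a periodic GNSS fix corrects the drifting estimate. The function names and the simple weighted blend are illustrative assumptions; a production system would use a Kalman filter or similar.

```python
import math

def dead_reckon(pos, heading_rad, speed_mps, dt):
    """Propagate position from the last known fix using IMU-derived
    speed and heading (classic dead reckoning)."""
    dx = speed_mps * dt * math.cos(heading_rad)
    dy = speed_mps * dt * math.sin(heading_rad)
    return (pos[0] + dx, pos[1] + dy)

def fuse(dr_pos, gnss_pos, gnss_weight=0.8):
    """Blend the dead-reckoned estimate with a GNSS fix.
    A weighted average stands in for the Kalman filter a real
    sensor fusion algorithm would use."""
    w = gnss_weight
    return (w * gnss_pos[0] + (1 - w) * dr_pos[0],
            w * gnss_pos[1] + (1 - w) * dr_pos[1])

# Vehicle travelling east at 20 m/s; a GNSS fix arrives after 1 s
pos = (0.0, 0.0)
pos = dead_reckon(pos, heading_rad=0.0, speed_mps=20.0, dt=1.0)
pos = fuse(pos, gnss_pos=(19.5, 0.2))
```

Between fixes the estimate relies entirely on dead reckoning, which is why IMU accuracy matters most exactly where GNSS is unavailable.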
Combining these inputs to guide automated driving decisions is a complex feat of real-time data gathering, synthesis and processing – and it has to work. The consequences of a poor or mis-timed driving decision could be catastrophic, not just to the vehicle and any occupants, but also to other road users.
Sensors, sensor fusion algorithms and automated driving functions must be rigorously tested
Before they can share the road with human drivers, connected autonomous vehicles must be thoroughly and rigorously tested. Manufacturers must understand how the vehicle will behave in a huge range of conditions and scenarios – the sort it might encounter over millions of miles of driving.
System designers, OEMs and suppliers need to test the efficacy of the vehicle’s sensors, both individually and in concert. They need to know that each individual sensor is optimally placed in the vehicle, that it’s outputting accurate and reliable data, and that the sensor fusion algorithm that combines the data to create a complete picture of the environment is doing so effectively.
They also need to understand how the sensors behave – and how the sensor fusion algorithm responds – in a huge range of driving conditions, including when interference is present, or the vehicle is being subjected to a deliberate cyberattack.
Real-world testing isn’t sufficient: simulation is essential
The scale and rigour required mean testing can’t be conducted purely in the real world. There’s little scope for driving a test vehicle along the hundreds of millions of varied miles required to verify safe and reliable performance – at least not in the early stages of development. And with OEMs vying to put autonomous cars on the roads as early as the mid-2020s, it would simply take too long.
What’s more, unusual edge case scenarios (like the bus that didn’t give way to one of Google’s self-driving cars, resulting in an unexpected collision) might not present themselves during real-world test drives. And even if they did, it would be hard to re-create them perfectly in order to test any updates to the control system.
That’s why manufacturers are turning to lab-based simulation with both physical and virtual vehicle models. It’s the only way to rack up millions (in some cases, billions) of realistic driving miles in a way that’s fast, scalable, reliable, repeatable, ethical and economically viable.
Four key aspects of PNT testing in connected autonomous vehicles
Creating a sufficiently realistic version of the world means simulating not just the road, but the entire driving environment – including other road users, obstacles in the road, the surrounding 3D landscape (like hillsides, trees and buildings), the weather, and activity in the RF spectrum.
As the market leader in testing position, navigation and timing capabilities, Spirent is working closely with the connected autonomous vehicle industry to provide the simulators, software and expertise needed to test these capabilities in the lab.
Over the next few weeks, we’ll draw on this experience in a series of blogs exploring four key aspects of testing position, navigation and timing effectiveness in CAV systems:
Testing PNT capabilities with hardware in the loop
Simulating GNSS multipath and obscuration effects
Over-the-air RF signal simulation using an anechoic chamber
Testing the efficacy of RTK and PPP in generating an accurate position
Read Blog #2: "Testing position, navigation and timing with hardware in the loop for connected and autonomous vehicles"