AutoConnect Magazine February 2016 | Page 15

systems is called sensor fusion.

This fusion allows the onboard software to accurately understand the full 360° environment around the car and produce a robust representation, including static and dynamic objects (parked cars and pedestrians, for example). The system also uses Deep Neural Networks (DNNs) to detect and classify objects, which can dramatically increase the accuracy of the resulting fused sensor data.
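To make the idea of sensor fusion concrete, here is a minimal sketch of one simple form of it: combining noisy distance estimates from two sensors (say, radar and a camera) by weighting each reading by its confidence. The function name, sensor values and variances are illustrative assumptions, not anything from Nvidia's stack, and real automotive systems use far more sophisticated filters.

```python
def fuse_estimates(estimates):
    """Fuse (value, variance) pairs into a single estimate.

    Each sensor contributes in proportion to its confidence
    (the inverse of its variance); the fused variance ends up
    smaller than any individual sensor's variance.
    """
    total_weight = sum(1.0 / var for _, var in estimates)
    fused_value = sum(val / var for val, var in estimates) / total_weight
    fused_variance = 1.0 / total_weight
    return fused_value, fused_variance

# Hypothetical readings: radar puts the car ahead at 50.0 m
# (variance 0.5), the camera at 52.0 m (variance 2.0).
value, variance = fuse_estimates([(50.0, 0.5), (52.0, 2.0)])
print(value, variance)  # the fused estimate trusts the radar more
```

The fused distance lands nearer the radar's reading because the radar is the more confident sensor, which is the essence of weighting sources during fusion.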

A DNN gives a machine the ability to learn through onboard algorithms, improving its performance over time and, in the case of autonomous cars, helping it avoid crashes.
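The core idea behind a neural network classifier can be sketched in a few lines: layers of weighted sums passed through non-linear activations, producing a score for a class such as "pedestrian". The weights and feature values below are invented for illustration; a real DNN learns millions of weights from labelled driving data rather than having them written by hand.

```python
import math

def relu(x):
    # Rectified linear unit: the standard hidden-layer activation.
    return max(0.0, x)

def sigmoid(x):
    # Squashes a raw score into a 0..1 probability-like value.
    return 1.0 / (1.0 + math.exp(-x))

def tiny_network(features, w_hidden, w_out):
    # Hidden layer: each unit is a weighted sum of the inputs, then ReLU.
    hidden = [relu(sum(w * f for w, f in zip(ws, features)))
              for ws in w_hidden]
    # Output unit: weighted sum of hidden activations -> "pedestrian" score.
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)))

# Illustrative (hand-picked, not learned) weights and fake features.
w_hidden = [[0.8, -0.4], [-0.5, 0.9]]
w_out = [1.2, -1.0]
print(tiny_network([1.0, 0.2], w_hidden, w_out))  # scores high
print(tiny_network([0.1, 1.0], w_hidden, w_out))  # scores low
```

Training is the process of adjusting those weights so the scores match labelled examples; at driving time the network only runs this cheap forward pass.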

“We now have over 50 car makers, suppliers and research groups doing work with Drive PX; it's pretty incredible how quickly we've reached a critical mass and getting people into developing prototype vehicles and doing real trials,” says Shapiro.

But that was last year. Nvidia has now launched the second-generation Drive PX 2, which takes the system to the next level, and more importantly onto public roads.

Drive PX 2 provides the same processing power as 150 MacBook Pros and is made up of two next-generation Tegra processors plus two next-generation GPUs, based on Nvidia's latest Pascal architecture. It's capable of delivering up to 24 trillion deep learning operations per second. That's over 10 times more computational horsepower than the previous generation.

The system's capabilities should enable it to quickly learn how to address the challenges of everyday driving, such as unexpected road debris, erratic drivers and construction zones. Deep learning also addresses numerous problem areas where traditional computer vision techniques are insufficient, such as poor weather conditions like rain, snow and fog, and difficult lighting conditions such as sunrise, sunset and extreme darkness.

Add to this that, for general-purpose floating point operations, Drive PX 2's GPU architecture is capable of up to 8 trillion operations per second, over four times more than the previous generation, and the platform lets car firms address the full breadth of autonomous driving algorithms, including sensor fusion, localisation and path planning.
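Of the algorithm families just listed, path planning is the easiest to illustrate compactly. The sketch below uses breadth-first search on a tiny occupancy grid to route around an obstacle; the grid and coordinates are invented for the example, and production planners work in continuous space with vehicle dynamics and cost maps rather than a toy grid.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest path on a 2D grid where 0 = free and 1 = blocked."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        current = frontier.popleft()
        if current == goal:
            # Walk the parent links backwards to rebuild the route.
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = current
                frontier.append((nr, nc))
    return None  # no route exists around the obstacles

# A wall across the middle row forces a detour.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(plan_path(grid, (0, 0), (2, 0)))
```

Breadth-first search guarantees the shortest route in steps; real planners swap in A* or sampling-based methods and richer cost functions, but the reconstruct-from-parents structure is the same.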

Volvo will be the first to make use of the technology as it puts 100 fully autonomous vehicles on the roads of Gothenburg, each one having to deal with real traffic conditions and react to other drivers in real time.

But while Nvidia might supply the computational power, it will be down to the individual car companies whether autonomous technology succeeds.

“We supply the hardware, the software development tools, and then each of our customers go on and create their individual product. I think we're going to see a shift in how they compete. We're going to see the race to offer new features and new self-driving modes over time,” says Shapiro.

We're going to see a lot more of Nvidia in the future, whether powering infotainment functions or entire vehicles, so it might be wise to check the specs of your next car for the company's name and find out which chip it's using.

"We now have over 50 car makers, suppliers and research groups doing work with Drive PX; it's pretty incredible how quickly we've reached a critical mass"

Danny Shapiro, automotive director at Nvidia