We are back to the battle of graphics processing units (GPUs) vs field-programmable gate arrays (FPGAs), with NVIDIA at the centre of attention. This time, there’s more at stake than the power-efficiency advantage of FPGAs versus the cost-efficiency advantage of GPUs. The recent tech boom in cloud computing, image processing, robotics, deep learning and big data workloads has necessitated a shift towards GPUs.
But that doesn’t rule out FPGAs, which are designed to perform fixed-point operations with a close-to-hardware programming approach. GPUs, on the other hand, are optimised for parallel processing of floating-point operations using thousands of small cores, according to this whitepaper. FPGAs also score over GPUs in interface flexibility, improved by the integration of programmable logic with CPUs and standard peripherals. They also provide huge processing capability with great power efficiency. But when it comes to cost efficiency, GPUs are winning the battle: development and software acceleration are cheaper on GPUs, while FPGAs require a team of design engineers. Another consideration is that many algorithms today are designed directly for GPUs, and FPGA developers are difficult and expensive to hire.
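To make the fixed-point vs floating-point distinction above concrete, here is a minimal sketch (not from the whitepaper) of the fixed-point arithmetic style that FPGA logic favours: real values are stored as scaled integers, and multiplication becomes an integer multiply plus a bit shift. The Q8 format chosen here is a hypothetical example.

```python
# Illustrative sketch: Q8 fixed-point arithmetic, the kind of integer-only
# operation FPGAs implement cheaply, versus ordinary floating point.
FRAC_BITS = 8            # hypothetical choice: 8 fractional bits (Q8)
SCALE = 1 << FRAC_BITS   # a real value x is stored as round(x * SCALE)

def to_fixed(x: float) -> int:
    return round(x * SCALE)

def fixed_mul(a: int, b: int) -> int:
    # Integer multiply, then shift off the extra fractional bits.
    return (a * b) >> FRAC_BITS

def to_float(a: int) -> float:
    return a / SCALE

a, b = to_fixed(1.5), to_fixed(2.25)
print(to_float(fixed_mul(a, b)))   # 3.375, computed with integers only
```

A GPU would instead run thousands of such multiplies in parallel on native floating-point cores; on an FPGA, each `fixed_mul` maps to a small block of dedicated integer logic.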
Is FPGA Suitable For Self Driving Technology?
This is not the age-old GPU-vs-FPGA comparison; that is not the crux of the matter. FPGAs, which can perform a range of logical functions simultaneously, are being considered unsuitable for emerging technologies such as self-driving cars and deep learning applications. At the recently concluded GPU Technology Conference in March, NVIDIA chief Jensen Huang declared that FPGA is not the right answer for autonomous tech development.
According to Huang, FPGA is better suited to prototyping and wouldn’t deliver the desired results. With leading tech giants like Intel, Qualcomm, NVIDIA and Google ploughing millions of dollars into self-driving technology and AI chips, it is evident that chipmakers want to design chips purpose-built for autonomous driving.
Today, thanks to its immense popularity and cost efficiency, the GPU has emerged as the dominant chip architecture for autonomous technology, and NVIDIA has intensified the battle by increasing computing speed tenfold and reducing power consumption sixteenfold. The company has further strengthened its position in the automotive market through partnerships with Audi, ZF and Zenrin, and is building an AI-driven big data system.
Big Players Dominate The Self Driving Market, But Smaller Players Are Emerging
The chip market is dominated by players like Intel, AMD, Google, Altera and Cambricon among others, but NVIDIA is the market leader. Besides the GPU, CPU and FPGA, the ASIC is another mainstream chip that is fast becoming the industry standard for self-driving technology. A key advantage of ASICs is that, being designed for a fixed workload, they offer the best balance of computing performance and power consumption. In the near future, industry analysts believe, research and development of AI chips will gain traction as autonomous technology and certain use cases drive demand for specialised AI silicon.
Until autonomous cars reach the stage of mass production, various chip architectures remain in play. For example, Chinese startup Horizon Robotics, a leader in embedded AI, released an AI processor named Journey 1.0 for smart driving and another called Sunrise 1.0 for smart cameras. In an automotive setting the two chips work together, and the startup, which received its latest funding from Intel Capital, claims Journey 1.0 has a detection accuracy of more than 99 percent for vehicles, pedestrians, lane lines and traffic signs.
Qualcomm is hardly a small player, but the chipmaker doesn’t want to be left behind in driving autonomous technology forward. The chipmaker of choice for smartphones, the American semiconductor company launched its Drive Data platform, built around a variant of the Snapdragon processor, in 2016. As part of its automotive push, the Drive Data platform builds on Qualcomm’s Snapdragon 820 Automotive processor, delivering the next level of intelligence and mobile connectivity for autonomous driving. And to compete with NVIDIA and Intel for market dominance, Qualcomm is providing a full platform that brings together key capabilities such as HD mapping and machine intelligence for safe navigation.