The next smartphone you buy may be much smarter thanks to ARM's new machine learning processor. The British chip-design firm on Tuesday unveiled its first dedicated machine learning processors, which will be used in mobile phones and smart-home devices.
“These machine learning processors are meant for running machine learning models at the edge. The promise is that the new processors will be highly efficient, offering mobile performance of 4.6 teraops,” said a media report, citing ARM.
Dubbed ‘Project Trillium’, the project combines hardware and software techniques to speed up machine learning and neural-network workloads. The new chips will allow devices to run models offline, cutting the lag inherent in sending information back and forth to a server. According to a report, the new chips will also use less power than ARM’s other designs and be better at moving data in and out of memory.
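To make the offline-execution idea concrete, here is a minimal sketch of on-device ("edge") inference in pure Python. The network shape and weights are illustrative stand-ins, not anything from ARM's actual designs; the point is simply that the whole forward pass runs locally, so no input data crosses the network and no round-trip latency is incurred.

```python
def relu(xs):
    # Elementwise rectified linear activation.
    return [max(0.0, x) for x in xs]

def dense(inputs, weights, biases):
    # One fully connected layer: out_j = sum_i(inputs_i * W[i][j]) + b_j
    return [sum(i * w for i, w in zip(inputs, col)) + b
            for col, b in zip(zip(*weights), biases)]

def predict(features):
    # Hypothetical 3-input, 2-hidden, 1-output model standing in for an
    # on-device ML workload; real weights would come from training.
    w1 = [[0.5, -0.2], [0.1, 0.8], [-0.3, 0.4]]
    b1 = [0.0, 0.1]
    hidden = relu(dense(features, w1, b1))
    w2 = [[1.0], [0.5]]
    b2 = [0.0]
    return dense(hidden, w2, b2)

# Runs entirely on the local CPU: no server call, no upload of the input.
print(predict([1.0, 2.0, 3.0]))
```

A dedicated ML processor accelerates exactly this kind of multiply-accumulate-heavy loop in hardware, which is where the efficiency and teraops claims come from.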
“The company believes that putting machine learning into mobile devices is the best computing solution for the future. If we kept much AI in the cloud, or web-connected data centers, then we would have to send too much data through the internet to feed those processors. We believe machine learning processors will outperform GPUs and CPUs,” Jem Davies, vice president, fellow and general manager of ARM’s Machine Learning Group, told a news agency in an interview.
At present, portable devices that use machine learning mostly depend on cloud-based servers and do not have enough horsepower to run artificial intelligence algorithms themselves. The new chips will handle AI tasks locally, cutting down the dependency on cloud-based assistants such as Google Assistant, Siri and Alexa.
According to a report, the company is sharing the new machine learning designs with its hardware partner Qualcomm, and the first chips will debut this year or in 2019.