ARM working on new chip to support AI, ML workloads on mobile devices

15 Feb, 2018

British chip designer ARM Holdings Ltd is working on a new project called Trillium to develop chips that can support artificial intelligence (AI) and machine learning (ML) workloads on mobile devices.

ARM doesn't make chips itself; instead, it licenses its designs to other manufacturers. It joins the likes of Nvidia, Amazon, Google and Apple in adding AI and ML capabilities to smartphones, tablets and other Internet of Things devices, which could become more efficient if they can crunch data for decision-making on the device rather than passing it to the cloud.

The concept of processing data near its source is known as edge computing. According to ARM, the world is moving towards this approach because of the sheer cost, in power, bandwidth and money, of scaling up cloud infrastructure.

Edge computing also promises faster response times from the device. In other words, it reduces latency.
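The latency argument can be made concrete with some back-of-the-envelope arithmetic. A minimal sketch, where all of the millisecond figures are hypothetical and chosen only for illustration (the article quotes no specific numbers):

```python
# Illustrative latency comparison: cloud inference pays a network round trip
# on top of model execution, while on-device (edge) inference does not.
# All numbers below are hypothetical, for intuition only.
network_rtt_ms = 100.0      # assumed mobile-network round trip to a cloud server
cloud_inference_ms = 10.0   # assumed server-side model execution time
device_inference_ms = 40.0  # assumed slower on-device model execution time

cloud_total_ms = network_rtt_ms + cloud_inference_ms  # total cloud-path latency
edge_total_ms = device_inference_ms                   # total edge-path latency

print(f"cloud path: {cloud_total_ms:.0f} ms, edge path: {edge_total_ms:.0f} ms")
```

Even when the on-device model is slower than the server-side one, the edge path can win overall because it avoids the network round trip entirely, and it keeps working when the connection drops.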

"Google realised that if every Android device in the world performed three minutes of voice recognition each day, the company would need twice as much computing power to cope," said Jem Davies, vice president, fellow and general manager, machine learning at ARM.

"And, to be reliable, ML cannot be dependent on a stable Internet connection, especially when it is governing safety-critical operations," he added.

As part of Trillium, the SoftBank-owned ARM is working on an ML chip and an Object Detection (OD) chip.

While the ML chip aims to speed up workloads such as natural language processing and facial recognition, the OD chip will identify different kinds of objects and people.

ARM said that the ML chip will be available in the middle of the year and the OD chip will be available to manufacturers by the end of February.

In terms of specifications, the ML chip will let devices "run almost five trillion operations per second (TOPS) within a mobile power budget of just 1-2 watts", making it well suited to low-power devices such as smartphones and tablets.
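Those two figures together imply a power-efficiency number. A minimal sketch of the arithmetic, reading "almost five trillion operations per second" as roughly 4.6 TOPS (an assumption for illustration; the article gives only the approximate phrasing) and evaluating both ends of the stated 1-2 watt budget:

```python
# Rough efficiency arithmetic from the article's figures.
# Assumption: "almost five trillion operations per second" is taken as ~4.6 TOPS;
# the 1-2 W power budget is as stated in the article.
tops = 4.6  # tera-operations per second (assumed reading of "almost five trillion")

# TOPS per watt at each end of the quoted power budget.
efficiency = {watts: tops / watts for watts in (1.0, 2.0)}

for watts, tops_per_watt in efficiency.items():
    print(f"At {watts:.0f} W: {tops_per_watt:.1f} TOPS/W")
```

So the claim works out to somewhere between roughly 2.3 and 4.6 TOPS per watt, which is the kind of efficiency range that makes always-on ML plausible inside a phone's thermal envelope.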

"That’s clearly vital for products such as dive masks (AI driven to identify wildlife underwater) but also important for any device, such as an autonomous vehicle, that cannot rely on a stable internet connection," said Davies.

Other companies such as Google, Amazon and Intel have also been ramping up their AI and ML efforts. 

Google has started offering its new artificial intelligence-tailored chips on its cloud platform to other companies for advanced testing, as part of its effort to get machine learning models running faster.

Just last week, Google merged its smart home devices unit Nest with its hardware team in a bid to outgun Amazon's Alexa.

Separately, Amazon is reportedly developing an artificial intelligence-powered chip that will bolster devices using its smart assistant, Alexa, amid efforts to consolidate its lead in the consumer-facing AI segment and keep Google at bay.

Intel Corporation has also launched a new chip aimed at powering applications that process data on the device itself instead of relying on the cloud.

Named the Intel Xeon D-2100, the processor is designed for low-latency edge computing.