Posted by VontiRamVenkat
The SoftBank-owned company says the current boom is a ‘once-in-a-generation inflection’
British chip designer ARM is the latest firm to prime
the AI pump with specialized hardware, unveiling two new processor
designs today that it promises will deliver “a transformational amount
of compute capability” for companies building machine learning-powered
gadgets.
The designs are for the ARM Machine Learning (ML)
Processor, which will speed up general AI applications from machine
translation to facial recognition; and the ARM Object Detection (OD)
Processor, a second-generation design optimized for processing visual
data and detecting people and objects. The OD processor is expected to
be available to industry customers at the end of this month, while the
ML processor design will be available sometime in the middle of the
year.
“These are new, ground-up designs, not based on existing
CPU or GPU architectures,” ARM’s vice president of machine learning, Jem
Davies, told The Verge.
As with all of ARM’s chips, the company will not manufacture the
processors itself, but will instead license the designs to
third-party manufacturers. In the past, ARM’s customers have included
chipmakers like Broadcom, but also hardware firms like Apple, which
tweaks ARM’s designs for its own devices. The ML processor will
primarily be of interest to makers of tablets and smartphones, while
the OD processor could be put to a more diverse range of uses, from
smart security cameras to drones.
Davies said the company was already in talks with a
number of phone makers interested in licensing the ML chip, but would
not name any specific firms. At the moment, specialized AI processors
only appear in high-end devices, like Apple’s latest crop of iPhones and Huawei’s Mate 10.
But Davies is confident that the ubiquity of AI applications is going
to mean that these chips will quickly become standard-issue across a
range of price points.
“Our belief from talking to the market is that this will trickle down very, very fast indeed,” Davies told The Verge. “In China they’re already talking about putting this in entry-level smartphones from next year.”
These chip designs won’t just be useful for smartphones, though; they will also
help power the next generation of Internet of Things (IoT) devices. Like
many companies developing AI chips, ARM is evangelical about the
importance of edge computing — meaning processing is done on-device,
rather than sending data back to the cloud. This has been a big factor
in phone companies’ adoption of AI chips, as on-device computation has a
number of advantages over cloud computing. It’s more secure, as the
data can’t be intercepted in transit; it’s quicker and more reliable, as
users don’t have to wait for their data to be processed by remote
servers; and it costs less — for both the customer and the provider.
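To make the edge-computing idea concrete, here is a minimal, illustrative sketch of on-device inference using TensorFlow Lite as a stand-in for the kind of workload an NPU like ARM’s ML processor would accelerate. This is not ARM’s own tooling, and the model filename and dummy camera frame are assumptions; the point is simply that the data is processed locally and only the result, not the raw input, ever needs to leave the device.

```python
# Illustrative sketch only: on-device ("edge") inference with TensorFlow Lite.
# The model path and input handling below are assumptions for the example.
import numpy as np
import tflite_runtime.interpreter as tflite  # alternatively: tf.lite.Interpreter

# Load a quantized image-classification model that ships with the device,
# so no image data has to be sent to a remote server.
interpreter = tflite.Interpreter(model_path="mobilenet_v2_quant.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# A dummy camera frame standing in for a real capture
# (shape and dtype are taken from the model's input spec).
frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()  # inference runs entirely on the handset
scores = interpreter.get_tensor(output_details[0]["index"])
print("Top class:", int(np.argmax(scores)))
```

On a phone with a dedicated ML processor, the same model graph would be dispatched to the NPU rather than the CPU, which is where the speed and power advantages Davies describes come from.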
“Google said
that if every user used voice search for just three minutes a day the
company would have to double the number of servers it has,” notes
Davies. As more smart devices start running more intensive AI
applications, he says, “there just won’t be enough bandwidth available
online. You’re going to break the internet.” Davies adds that although
today’s chip designs are targeted at mobile devices, the broader chip
architecture could scale up to provide AI chips for servers as well.
Patrick Moorhead, principal analyst at Moor Insights & Strategy, told The Verge
that the new chip designs made sense for ARM, as more companies
transition their computational workload from analytics to machine
learning. However, he thought that the impact these chips would have on
the mobile industry would be limited. “The mobile market is flat now,
and I think this new kind of capability will help drive refresh
[consumers upgrading their phones] but not increase sales to where
smartphones are growing again,” said Moorhead.
ARM, of course, isn’t alone in trying to ride the AI wave with optimized silicon. Qualcomm is working on its own AI platform; Intel unveiled a new line of AI-specialized chips last year; Google is building its own machine learning chips for its servers; and, trying to take advantage of this moment of upheaval, ambitious startups like Graphcore are entering the industry, fueled by venture capital and eager to unseat incumbents.
As Davies puts it, this is a “once-in-a-generation
inflection,” with everything to play for. “This is something that’s
happening to all of computing.”