Apple uses its chip expertise to move ahead in AI
May 27, 2017
Mark Gurman, who seems to have quite a few reliable contacts around the Valley, reports in Bloomberg that Apple (NASDAQ:AAPL) is working on a hardware acceleration chip for artificial intelligence (AI) processing, known internally as the Apple Neural Engine. The chip could be incorporated into Apple mobile devices such as the iPhone and iPad, either as a discrete chip or integrated with Apple's custom systems on chip (SOCs).
Although Gurman only cites anonymous sources familiar with the matter, I consider the report very credible. Apple has for some time been seeking ways to build more AI functionality into the individual mobile device.
The motivation for this is customer privacy. Currently, AI assistants such as Siri, Cortana, Google Assistant, and Alexa are all hosted in the cloud and require an Internet connection to access. The simple reason is that AI functionality requires processing horsepower that only datacenters can provide.
But this constitutes a potential privacy issue for users, since cloud-hosted AIs are most effective when they are observing the actions of the user. That way they can learn the user's needs and be more "assistive". This means that virtually every user action, including voice and text messaging, could be subject to such observation.
This has prompted Apple to look for ways to host some AI functionality on the mobile device, where it can be locked behind the protection of Apple’s redoubtable Secure Enclave. The barrier to this is simply the magnitude of the processing task.
Apple's ARM-based mobile SOCs have become enormously more powerful since the first A4 in 2010, but they are still underpowered for AI. However, they're close enough that Apple can contemplate a future capability that moves the AI assistant onto the mobile device.
Nvidia (NASDAQ:NVDA) is currently leading the way in mobile AI with its Xavier SOC for the forthcoming Drive PX 3. Nvidia believes that Xavier will have sufficient processing power to host Level 4 autonomous vehicle AI software. That's no small achievement given its 30 W power requirement, and it's probably more AI capability than Apple needs to power a local Siri-like assistant.
That 30 W is also still too high a power requirement for a smartphone, which is probably why Apple is looking to develop its own chip. This chip may operate along the same lines as Google's (NASDAQ:GOOG) Tensor Processing Unit (TPU), which accelerates certain specialized mathematical operations useful in AI.
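To give a sense of what those specialized operations are: neural network inference is dominated by dense multiply-accumulate arithmetic, and that is the work TPU-style hardware is built to parallelize. The sketch below is a generic illustration in NumPy, not Apple's or Google's actual implementation.

```python
import numpy as np

# A single fully connected layer's forward pass: the kind of dense
# multiply-accumulate workload that TPU-style accelerators target.
def dense_layer(x, weights, bias):
    # x: (batch, in_features), weights: (in_features, out_features).
    # The matrix multiply below amounts to millions of multiply-adds at
    # realistic layer sizes; dedicated hardware performs them in parallel,
    # often at reduced precision (e.g. 8-bit integers), to save power.
    return np.maximum(0.0, x @ weights + bias)  # ReLU activation

# Toy example: a batch of 4 inputs through a 256 -> 128 layer.
x = np.random.rand(4, 256).astype(np.float32)
w = np.random.rand(256, 128).astype(np.float32)
b = np.zeros(128, dtype=np.float32)
print(dense_layer(x, w, b).shape)  # (4, 128)
```

A CPU works through such a multiply serially; an accelerator with a dedicated multiply-accumulate array can do the same work in far fewer cycles and at far lower energy per operation, which is the whole point of putting one in a phone.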
Nvidia's Xavier will also feature its own form of TPU, which it calls a Tensor Core, built into the SOC. So an Apple-designed TPU integrated into a future A-series SOC is a natural approach.
Apple gaining on its competitors in AI
Last year, one of the negative narratives about Apple was that it was hopelessly behind in AI. I wrote an article stating my expectation that Apple would eventually catch up with and surpass its competitors. Apple is still arguably behind its competitors, but it appears to be gaining on them.
Apple's director of AI research is Ruslan Salakhutdinov, who is also an associate professor at Carnegie Mellon. He joined Apple only last October and recently presented at the MIT Technology Review conference. While he avoided discussing any specific Apple initiatives, his comments made clear his concern that AI software requires enormous computing power. The specialized AI accelerator chip may be his brainchild.
If Apple does build this chip, it could be a first for a mobile processor. Gurman's article incorrectly states that Qualcomm (NASDAQ:QCOM) already has hardware AI acceleration in its Snapdragon 835. Not really. Like most other mobile SOCs, the 835 has a number of processors, including a multi-core CPU, GPU cores, an Image Signal Processor (ISP) that performs high-speed processing of camera data, and a Digital Signal Processor (DSP). All of these may be used for AI computation, but none of them is specifically an AI hardware accelerator. The Snapdragon Neural Processing Engine that Gurman refers to is in fact a set of software APIs.
If Apple does incorporate such a chip into its SOCs, it will have taken a very significant step in closing or even eliminating the perceived gap between itself and its competitors. Designing and building such a chip highlights Apple’s advantages as an enormously profitable and vertically integrated mobile device maker. Betting against Apple’s ability to leverage these advantages in AI is a mistake, I believe. I continue to be long Apple and recommend it as a buy.
Thank you, TSMC
It appears that Apple's silicon foundry partner for its mobile SOCs, Taiwan Semiconductor Manufacturing Company (NYSE:TSM), has been a little naughty. According to the Chinese-language Economic Daily News (as reported by Digitimes), design features of the next generation iPhone were discussed at a TSMC Technology Symposium held on May 25 in Taiwan.
Apparently, TSMC let slip during the symposium that the future iPhone (presumably the iPhone 8) will indeed drop the home button in favor of a screen-embedded fingerprint reader. The new iPhone will also come with infrared sensors to support augmented reality applications. The screen aspect ratio was reported to be 18.5:9, which just happens to be the same as that of the Samsung (OTC:SSNLF) Galaxy S8.
Digitimes is not exactly the most reliable source of tech info on the planet, so these rumors should be taken with a grain of salt. But if true, we can thank TSMC for cracking open the door on the iPhone 8 just a little bit. I hope Apple's management isn't too annoyed.
TSMC is part of the Rethink Technology portfolio and is a recommended buy.