Qualcomm opens up its AI optimization software, says dedicated mobile chips are coming

From The Verge: In the race to get AI working faster on your smartphone, companies are trying all sorts of things. Some, like Microsoft and ARM, are designing new chips that are better suited to running neural networks. Others, like Facebook and Google, are working to reduce the computational demands of AI itself. But for chipmaker Qualcomm — whose processors account for 40 percent of the mobile market — the current plan is simpler: adapt the silicon that’s already in place.

To this end, the company has developed what it calls the Neural Processing Engine: a software development kit (SDK) that helps developers optimize their apps to run AI workloads on Qualcomm’s Snapdragon 600- and 800-series processors. That means if you’re building an app that uses AI for, say, image recognition, you can integrate Qualcomm’s SDK and the AI features will run faster on phones with a compatible processor.
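As a rough illustration of the workflow such an SDK enables, the sketch below shows how an app might load a pre-trained model and hand it to the engine, preferring the GPU and falling back to the CPU when a faster runtime isn’t available. This is a minimal sketch, not code from the article: the header paths, class names (zdl::SNPE::SNPEBuilder, IDlContainer::open, SNPEFactory::isRuntimeAvailable), the .dlc container format, and the "model_converted.dlc" filename are assumptions based on later public releases of Qualcomm’s Snapdragon Neural Processing Engine SDK, in which a trained Caffe or TensorFlow model is first converted to a .dlc file with the SDK’s offline tools.

```cpp
// Minimal sketch (assumed SNPE-style API, not code from the article):
// load a converted model and run it on the best available Snapdragon runtime.
#include <memory>

#include "DlContainer/IDlContainer.hpp"  // loader for .dlc model containers
#include "DlSystem/DlEnums.hpp"          // Runtime_t enum (CPU / GPU / DSP)
#include "DlSystem/String.hpp"
#include "SNPE/SNPE.hpp"
#include "SNPE/SNPEBuilder.hpp"
#include "SNPE/SNPEFactory.hpp"

// Pick the fastest runtime the device actually supports; the SDK hides the
// hardware details, so the app only states a preference.
static zdl::DlSystem::Runtime_t pickRuntime() {
    using zdl::DlSystem::Runtime_t;
    if (zdl::SNPE::SNPEFactory::isRuntimeAvailable(Runtime_t::GPU)) {
        return Runtime_t::GPU;
    }
    return Runtime_t::CPU;  // always-available fallback
}

std::unique_ptr<zdl::SNPE::SNPE> buildNetwork() {
    // "model_converted.dlc" is a placeholder for a trained network that has
    // already been converted to the SDK's container format.
    auto container = zdl::DlContainer::IDlContainer::open(
        zdl::DlSystem::String("model_converted.dlc"));
    if (!container) {
        return nullptr;  // model file missing or unreadable
    }

    zdl::SNPE::SNPEBuilder builder(container.get());
    return builder.setRuntimeProcessor(pickRuntime()).build();
}
```

The returned network object is then fed input tensors (for example, a preprocessed camera frame for image recognition) and executed on-device; the runtime selection is the part the SDK abstracts away from the developer.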

Qualcomm first announced the Neural Processing Engine a year ago as part of its Zeroth platform (which has since been killed off as a brand). Since last September it has been working with a handful of partners to develop the SDK, and today it’s opening it up to all developers.

“Any developer big or small that has already invested in deep learning — meaning they have access to data and trained AI models — they are the target audience,” Gary Brotman, Qualcomm’s head of AI and machine learning, told The Verge. “It’s simple to use. We abstract everything under the hood so you don’t have to get your hands dirty.”

Qualcomm says one of the first companies to integrate the SDK is Facebook, which is currently using it to speed up the augmented reality filters in its mobile app. With the Neural Processing Engine, says Qualcomm, Facebook’s filters load five times faster than a “generic CPU implementation.”

View: Article @ Source Site