Google has teamed up with Movidius, a company specializing in machine vision and deep learning technology, to bring more intelligence to mobile devices. The aim of the partnership is for more machine-intelligence tasks to run locally on a phone rather than relying on cloud processing.
As part of the roadmap, Movidius says that future devices should be able to understand both images and sound quickly, with most of the computation taking place on the device using a low-power chip.
The agreement enables Google to deploy its advanced neural computation engine on Movidius' ultra-low-power platform, introducing a new way for machine intelligence to run locally on devices. Local computation keeps data on the device, lets features work without an internet connection, and reduces latency. As a result, future products will be able to understand images and audio with speed and accuracy, offering a more personal and contextualized computing experience.
Under the agreement, Google will source processors from Movidius, starting with the company's latest chip, the MA2450, and will also contribute to Movidius' neural network technology roadmap.