A couple of years ago, Google unveiled Tensor Processing Units (TPUs), the specialized chips that reside in the company’s data centers and accelerate heavy machine-learning workloads. Now, Google is moving AI expertise from the cloud to edge devices with its new Edge TPU, a tiny AI accelerator for on-device machine learning on IoT devices.
The Edge TPU chipset will carry out a task called “inference”: running an already-trained model to do things like recognize an object in a picture. So while the TPUs in Google’s servers train the AI models, the Edge TPUs handle the inference. These chips are aimed at enterprise jobs, not your next smartphone; one real-world application could be automating quality-control checks in factories.
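To make the training/inference split concrete, here is a minimal sketch in plain Python. The “model” is just hardcoded weights standing in for a network trained in the cloud; applying it to new input locally is the inference step an Edge TPU would accelerate. All names and numbers here are illustrative assumptions, not Google APIs.

```python
# Toy linear classifier whose weights were (hypothetically) learned
# elsewhere, e.g. on cloud TPUs, then shipped to the device.
TRAINED_WEIGHTS = [0.8, -0.5]
TRAINED_BIAS = 0.1

def infer(features):
    """On-device inference: apply the pre-trained model to new input.

    No training happens here; we only evaluate the fixed model,
    which is exactly the workload an Edge TPU targets.
    """
    score = sum(w * x for w, x in zip(TRAINED_WEIGHTS, features)) + TRAINED_BIAS
    # e.g. a factory quality-control check flagging defective parts
    return "defect" if score > 0 else "ok"

print(infer([1.0, 0.2]))  # 0.8 - 0.1 + 0.1 = 0.8 -> "defect"
print(infer([0.1, 1.0]))  # 0.08 - 0.5 + 0.1 = -0.32 -> "ok"
```

The point of the split is that the expensive part (learning the weights) runs once in the data center, while the cheap part (evaluating them) runs repeatedly on each device.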
On-device machine learning has several advantages. It eliminates the need to send data over the internet for analysis, and because processing happens on the device itself, data stays more secure, results arrive faster, and downtime is reduced. Google isn’t the only company designing AI chips: Qualcomm, MediaTek, and Arm, among others, have their own AI accelerators.
However, there is one area where Google has an advantage over its rivals: the ability to control the entire AI stack. Users can store their data on Google Cloud, train models on Google’s TPUs, and then carry out on-device inference using the new chipsets.
All of this will be enabled by Cloud IoT Edge, software for data processing and deploying on-device machine-learning capabilities. It runs on Linux and Android Things-based devices, and a development kit will be made available to developers.