Google has unveiled the Edge TPU (Tensor Processing Unit), an AI chip that lets devices and products process data locally and more quickly. The chip's initial use is in industrial manufacturing: LG is already testing it in a system that detects defects in glass for displays.
Google started using TPUs two years ago to accelerate certain workloads in its own data centers, avoiding commercially available hardware from vendors like Nvidia.
The Edge TPU is a tiny, low-power chip designed specifically for inference, the prediction side of AI. With Edge TPUs, applications on devices and gadgets run faster and more reliably, because a major portion of the computation happens on the device itself. That sharply reduces how often a device must reach out to other devices, or to the internet, for computing power.
“Google isn't making the Edge TPU to compete with traditional chips… It's very good for all the silicon vendors and the device makers,” says Injong Rhee, Samsung’s former CTO, who joined Google as an Entrepreneur-In-Residence in February. Edge TPUs could be “disruptive for the cloud competition,” Rhee says, because most of the computation happens at the edge rather than in data centers.
The chips run models built with TensorFlow Lite, a lightweight version of Google's TensorFlow AI software, and can handle certain kinds of computation more efficiently than traditional chips. Amid competing technology such as Microsoft's chip for IoT devices, the Edge TPU's lower energy use and cost-effectiveness should attract many customers, especially in industry.
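Much of that efficiency comes from running inference at reduced numerical precision. As an illustration only, here is a minimal sketch of symmetric 8-bit quantization, the kind of arithmetic edge inference chips commonly rely on; this is a hypothetical example, not the Edge TPU's actual scheme, and the function names are invented for clarity:

```python
# Illustration of symmetric 8-bit quantization, the reduced-precision
# arithmetic that edge inference chips typically use to cut power and memory.
# Hypothetical sketch only -- not the Edge TPU's actual quantization scheme.

def quantize(weights):
    """Map float weights to int8 values in [-127, 127] plus a scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(q_weights, scale):
    """Recover approximate float weights from the int8 values."""
    return [q * scale for q in q_weights]

weights = [0.42, -1.27, 0.05, 0.9]
q, scale = quantize(weights)
approx = dequantize(q, scale)
# Each quantized value fits in one byte instead of four,
# at a small and usually acceptable accuracy cost.
```

Shrinking each weight from 32 bits to 8 is one reason a chip like this can run a model's predictions on-device with far less energy than a general-purpose processor.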