At Google I/O today, Google CEO Sundar Pichai provided an update on the next generation of the custom-made Tensor Processing Units (TPUs) that Google uses to power its Google Compute Engine.
These new Cloud TPUs feature four chips on a single board and deliver 180 teraflops (180 trillion floating-point operations per second; by comparison, the top-end NVIDIA GTX Titan X GPU runs at just 11 teraflops). Furthermore, Google has linked 64 of these TPUs into a single "TPU Pod" supercomputer, for a combined processing power of 11.5 petaflops. Pichai says this new technology "lays the foundation for significant progress."
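The pod-level figure follows directly from the per-board number quoted above; a quick back-of-the-envelope check (assuming the simple aggregation of 64 boards at 180 teraflops each):

```python
# Verify the TPU Pod throughput claim: 64 boards x 180 TFLOPS each.
BOARDS_PER_POD = 64
TFLOPS_PER_BOARD = 180

pod_tflops = BOARDS_PER_POD * TFLOPS_PER_BOARD      # 11,520 teraflops
pod_pflops = pod_tflops / 1_000                     # convert to petaflops

print(f"{pod_pflops:.1f} petaflops")                # matches the quoted 11.5 PFLOPS
```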
Google first announced TPUs at 2016's Google I/O.