
Google I/O 2019 | Cloud TPU Pods: AI Supercomputing for Large Machine Learning Problems

Cloud Tensor Processing Unit (TPU) is an ASIC designed by Google for neural network processing. TPUs feature a domain-specific architecture built to accelerate TensorFlow training and prediction workloads, delivering performance benefits for machine learning in production. Learn the technical details of Cloud TPU and Cloud TPU Pods, and the new TensorFlow features that enable large-scale model parallelism for deep learning training.
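
For a flavor of what running TensorFlow on a Cloud TPU looks like, here is a minimal sketch using tf.distribute.TPUStrategy for data-parallel training on recent TensorFlow 2.x. It is an illustration only, not code from the talk: the TPU name ("my-tpu"), the toy Keras model, and the dataset are placeholder assumptions, and the model-parallelism features discussed in the session are not shown here.

```python
import tensorflow as tf

# Resolve the Cloud TPU by name ("my-tpu" is a placeholder for your TPU resource).
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-tpu")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# TPUStrategy replicates the model across the TPU cores and aggregates
# gradients for synchronous data-parallel training.
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    # Toy model purely for illustration.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )

# train_dataset would be a tf.data.Dataset (typically read from Cloud Storage):
# model.fit(train_dataset, epochs=5)
```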

 

Speakers: Kaz Sato and Martin Gorner

Session ID: TF6510
