Tensor Processing Units (TPUs)

I'm very excited that we can finally discuss this in public. Today at Google I/O, +Sundar Pichai revealed the TPU (Tensor Processing Unit), a custom ASIC that Google has designed and built specifically for machine learning applications. We've had TPUs deployed in Google datacenters for more than a year, and they are an order of magnitude faster and more power-efficient per operation than other computational solutions for the kinds of models we are deploying to improve our products. This computational speed allows us to use larger, more powerful machine-learned models, expressed in TensorFlow (tensorflow.org) and seamlessly deployed into our products, and to deliver the excellent results from those models in less time.
TPUs are used on every Google Search to power RankBrain (https://en.wikipedia.org/wiki/RankBrain), they were a key secret ingredient in the recent AlphaGo match against Lee Sedol, they are used for speech and image recognition, and they are powering a growing list of other smart products and features.
+Norm Jouppi and the rest of the team that developed this ASIC did a fabulous job, and it's great to see it discussed in public!
I'm very excited that my colleagues in Google DeepMind have been experimenting with TensorFlow for a while now, and after putting it through its paces for a bunch of research projects and experiments, they've decided to move all their future research work to TensorFlow.
"The distributed trainer also enables you to scale out training using a cluster management system like Kubernetes. Furthermore, once you have trained your model, you can deploy it to production and speed up inference using TensorFlow Serving on Kubernetes."
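To make the deployment step concrete, here is a minimal sketch of what serving a trained model on Kubernetes might look like, using the official `tensorflow/serving` container image. The model name, model path, and replica count are illustrative assumptions, not part of the original post:

```yaml
# Hypothetical Kubernetes Deployment for TensorFlow Serving.
# Model name and base path are placeholders for your own exported model.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: tf-serving
spec:
  replicas: 3                      # scale out inference horizontally
  selector:
    matchLabels:
      app: tf-serving
  template:
    metadata:
      labels:
        app: tf-serving
    spec:
      containers:
      - name: tf-serving
        image: tensorflow/serving  # official image; pin a version tag in practice
        args:
        - "--model_name=my_model"
        - "--model_base_path=/models/my_model"
        ports:
        - containerPort: 8500      # gRPC endpoint
        - containerPort: 8501      # REST endpoint
```

Kubernetes can then load-balance inference requests across the replicas with an ordinary Service in front of this Deployment.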
Announcing Wide & Deep Learning in #TensorFlow - useful for generic large-scale regression and classification problems with sparse inputs, such as recommender systems, search, and ranking problems. Our Wide & Deep Learning implementation is open-sourced as part of the TF.Learn API so that you can easily train a model yourself. Please check out the Google Research blog, linked below, to learn more!
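The open-sourced implementation lives in the TF.Learn API, but the core idea is simple enough to sketch directly: a linear ("wide") model over sparse features, which memorizes feature interactions, is trained jointly with a feed-forward ("deep") network over dense representations, which generalizes to unseen combinations. The sketch below re-expresses that structure with the Keras functional API; the feature sizes and layer widths are illustrative assumptions, not the actual TF.Learn implementation:

```python
import tensorflow as tf

# Assumed feature sizes for illustration only.
n_wide = 1000   # e.g. one-hot and cross-product sparse features
n_deep = 32     # e.g. dense embeddings of categorical features

wide_in = tf.keras.Input(shape=(n_wide,), name="wide")
deep_in = tf.keras.Input(shape=(n_deep,), name="deep")

# Wide part: a single linear layer that memorizes sparse interactions.
wide_logit = tf.keras.layers.Dense(1)(wide_in)

# Deep part: a small feed-forward net that generalizes via dense features.
deep = tf.keras.layers.Dense(64, activation="relu")(deep_in)
deep = tf.keras.layers.Dense(32, activation="relu")(deep)
deep_logit = tf.keras.layers.Dense(1)(deep)

# Joint prediction: sum the two logits, then squash for binary classification.
logit = tf.keras.layers.Add()([wide_logit, deep_logit])
output = tf.keras.layers.Activation("sigmoid")(logit)

model = tf.keras.Model(inputs=[wide_in, deep_in], outputs=output)
model.compile(optimizer="adam", loss="binary_crossentropy")
```

Both parts are trained jointly against the same loss, which is what distinguishes Wide & Deep from a simple ensemble of a linear model and a neural network.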
@martin_gorner When are you passing through Lannion for a fight with PredicSis? ;) http://www.predicsis.com/blog/2015/9/29/machine-learning-wars-amazon-ml-vs-google-api-vs-bigml-vs-predicsis-api Sergii (@lc0d3r): @martin_gorner Are you also planning to release a video?
What does one name an English-language parsing model, built with an open-source neural network framework implemented in #TensorFlow, that provides a foundation for Natural Language Understanding systems? Parsey McParseface, of course!
Introducing TensorFlow Playground, a nice visualization system that teaches you about neural networks and about the optimization process for training these models. Fun to play with! (Be sure to click the big "Play" button to get it started)