Jeff Dean, Qi Lu, Ilya Sutskever, and more: talks from ScaledML 2016
ht on Twitter
- Direct link to Jeff Dean's talk on TensorFlow:
- The contrast between Jeff's talk and Xavier's is interesting: one says, "If you're not considering how to use deep neural nets to solve your vision or understanding problems, you almost certainly should be", while the other says, "Deep Learning is not the only solution. It is dangerous to oversell Deep Learning."
I actually think they're both right. Jeff is coming from the point of view of image and text processing, where deep learning has had a lot of success (largely, I think, because local information around each input data point matters so much, so things like convolution layers work very well for aiding learning). Xavier is looking at it from the point of view of recommender systems, where deep learning has not yet yielded big gains (largely, I think, because it's unclear how to build deep architectures that aid learning from this kind of sparse data). Where I think they will come head to head is understanding: any kind of rich knowledge representation (beyond synonyms, which is the simple example of understanding Jeff gave in his talk, but not, I think, what most people mean by understanding).
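To make the point about local structure concrete, here is a minimal sketch (my own illustration, not from either talk) of a 1-D convolution: one small shared filter slides over local windows of the input, so the layer's parameters only have to capture nearby relationships, which is exactly the inductive bias that pays off in images and text.

```python
import numpy as np

def conv1d(x, w):
    """Valid-mode 1-D convolution (cross-correlation) of signal x with filter w.

    The same small filter w is applied to every local window of x,
    so the layer exploits locality and shares parameters across positions.
    """
    n, k = len(x), len(w)
    return np.array([np.dot(x[i:i + k], w) for i in range(n - k + 1)])

signal = np.array([0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0])
edge_filter = np.array([1.0, -1.0])  # responds only to local change, ignores distant context
print(conv1d(signal, edge_filter))  # → [ 0. -1.  0.  0.  1.  0.]
```

Sparse recommender data has no analogous notion of "nearby" inputs (user and item IDs have no natural ordering), which is one way to see why this trick doesn't transfer directly.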
Anyway, that's just my take. All the slides from all the talks are worth a read, especially Xavier's, Jeff's, Ilya's, and Qi's.