Vitali Burkov
I am sprutman
Vitali's posts

Post has shared content
In case you haven't seen it, the Distill Journal (http://distill.pub), created and edited by Google Brain team members +Christopher Olah and +Shan Carter, launched earlier this week. It's a totally new kind of journal and presentation style for machine learning research: online, interactive, and encouraging lucid, clear presentations of research that go beyond a Gutenberg-era presentation medium for ML topics. I'm really excited about this, and Chris and Shan have put a ton of work into it. The reactions so far have been quite amazing and positive.

"Yes. Yes. Yes. A million times yes. I can't count how many times I've invested meaningful time and effort to grok the key ideas and intuition of a new AI/DL/ML paper, only to feel that those ideas and intuitions could have been explained much better, less formally, with a couple of napkin diagrams.... I LOVE what Olah, Carter et al are trying to do here." (Hacker News)

"I really love this effort. Research papers are low bandwidth way to get information into our brains..." (Hacker News)

"finally, someone gets it!! we need to COMMUNICATE research CLEARLY" (Twitter)

"My gosh, interactive dataviz is now the core of an academic journal. Thank you @shancarter & @ch402 & @distillpub!" (Twitter)

"This new machine learning journal is seriously exciting; an emphasis on clear explanation & interactive illustration" (Twitter)

"'Research Debt' - I am curious where @distillpub will go but I really like this essay by @ch402 & @shancarter" (Werner Vogels, CTO of Amazon, Twitter)

Blog posts announcing Distill:

Google Research: https://research.googleblog.com/2017/03/distill-supporting-clarity-in-machine.html

OpenAI: https://blog.openai.com/distill/

Y Combinator: https://blog.ycombinator.com/distill-an-interactive-visual-journal-for-machine-learning-research/

DeepMind: https://deepmind.com/blog/distill-communicating-science-machine-learning/

Chris Olah's blog: http://colah.github.io/posts/2017-03-Distill/



Post has shared content
Google put machine learning on the new Android Wear 2.0 smartwatches. Apparently they did this without modifying the hardware, but instead by coming up with a new "lightweight, machine learning architecture." Key to this is that the training is offloaded to powerful cloud services, while real-time interaction is done on the device.

Post has shared content
Is SqueezeNet worth the squeeze?

AlexNet accuracy with 50x fewer parameters is impressive. Let's give it a go.

https://github.com/chasingbob/squeezenet-keras
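
The 50x figure comes mostly from SqueezeNet's Fire modules, which squeeze the channel count with 1x1 convolutions before expanding back out with parallel 1x1 and 3x3 convolutions. A back-of-the-envelope sketch of the savings (layer sizes follow the fire2 configuration described in the SqueezeNet paper; the helper functions here are illustrative and not part of the linked repo):

```python
def conv_params(in_ch, out_ch, k):
    """Weights plus biases for a single k x k convolution layer."""
    return in_ch * out_ch * k * k + out_ch

def fire_params(in_ch, squeeze, expand1x1, expand3x3):
    """A Fire module: 1x1 squeeze, then parallel 1x1 and 3x3 expands
    whose outputs are concatenated channel-wise."""
    s = conv_params(in_ch, squeeze, 1)
    e1 = conv_params(squeeze, expand1x1, 1)
    e3 = conv_params(squeeze, expand3x3, 3)
    return s + e1 + e3

# fire2: 96 input channels, squeeze to 16, expand to 64 + 64 = 128
fire = fire_params(96, 16, 64, 64)
# a plain 3x3 conv producing the same 96 -> 128 channel mapping
plain = conv_params(96, 128, 3)

print(fire, plain, round(plain / fire, 1))  # roughly a 9x saving per block
```

Stacking blocks like this (plus dropping the huge fully-connected layers AlexNet ends with) is where the overall ~50x parameter reduction comes from.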

Post has shared content
Nice

Does anyone else run into OutOfMemory problems with Delphi Berlin? This is ridiculous. They made the IDE 3GB-ready but still can't fix its memory leaks. Pathetic.