Profile

Vitali Burkov
Worked at Sprut Technology
Attended СШ 35
Lives in Naberezhnye Chelny
25,185 views

Stream

Vitali Burkov

Shared publicly
 
Can't stop listening to this.

Vitali Burkov

Shared publicly
 
 
How Google Translate squeezes deep learning onto a phone. "If we could do this visual translation in our data centers, it wouldn't be too hard. But a lot of our users, especially those getting online for the very first time, have slow or intermittent network connections and smartphones starved for computing power. These low-end phones can be about 50 times slower than a good laptop -- and a good laptop is already much slower than the data centers that typically run our image recognition systems. So how do we get visual translation on these phones, with no connection to the cloud, translating in real-time as the camera moves around?"

"We needed to develop a very small neural net, and put severe limits on how much we tried to teach it -- in essence, put an upper bound on the density of information it handles. The challenge here was in creating the most effective training data. Since we're generating our own training data, we put a lot of effort into including just the right data and nothing more. For instance, we want to be able to recognize a letter with a small amount of rotation, but not too much."

Vitali Burkov

Shared publicly
 
 
40 years ago, the Wagner-Fischer algorithm for calculating how similar two genomes are -- known as the "edit distance" -- was invented, but ever since, computer scientists have been looking for a more efficient algorithm. Now, it's been proven that a more efficient algorithm doesn't exist... well, almost. What they actually proved is that if it's possible to solve the edit-distance problem in less-than-quadratic time, then it's possible to solve an NP-complete problem in less-than-exponential time. But most mathematicians don't think NP-complete problems can be done in less-than-exponential time.
Proof that a 40-year-old algorithm is the best possible will come as a relief to computer scientists.
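For reference, the quadratic-time dynamic program in question is short; here is a plain Python version of Wagner-Fischer (my own sketch, not code from the paper):

def edit_distance(a, b):
    # dp[i][j] = edit distance between a[:i] and b[:j];
    # fills a (len(a)+1) x (len(b)+1) table, hence the quadratic running time.
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        dp[i][0] = i                                 # delete all of a[:i]
    for j in range(len(b) + 1):
        dp[0][j] = j                                 # insert all of b[:j]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # match / substitution
    return dp[len(a)][len(b)]

print(edit_distance("GATTACA", "GCATGCU"))  # 4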

Vitali Burkov

Shared publicly
 
 
"We'll train recurrent neural networks to generate text character by character and ponder the question "how is that even possible?"

PANDARUS:
Alas, I think he shall be come approached and the day
When little srain would be attain'd into being never fed,
And who is but a chain and subjects of his death,
I should not sleep.

Second Senator:
They are away this miseries, produced upon my soul,
Breaking and strongly should be buried, when I perish
The earth and thoughts of many states.

The source code is on GitHub, so you can reproduce all the experiments yourself. You can train character-level language models based on multi-layer Long Short-Term Memory (LSTM) neural networks: you give it a large chunk of text and it will learn to generate text like it, one character at a time.
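The repository's code is in Torch/Lua; as a rough sketch of the same idea in Python with PyTorch (layer sizes, file name, and training-loop details are my own assumptions, not taken from the repo):

import torch
import torch.nn as nn

class CharRNN(nn.Module):
    # Multi-layer LSTM that predicts the next character from the previous ones.
    def __init__(self, vocab_size, hidden_size=128, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.lstm = nn.LSTM(hidden_size, hidden_size, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, state=None):
        h, state = self.lstm(self.embed(x), state)
        return self.head(h), state                 # logits over the next character

text = open("input.txt").read()                    # the "large chunk of text"
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in text])

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=2e-3)
loss_fn = nn.CrossEntropyLoss()

seq_len = 100
for step in range(1000):                           # train: predict each next character
    i = torch.randint(0, len(data) - seq_len - 1, (1,)).item()
    x = data[i:i + seq_len].unsqueeze(0)
    y = data[i + 1:i + seq_len + 1].unsqueeze(0)   # same text shifted by one
    logits, _ = model(x)
    loss = loss_fn(logits.reshape(-1, len(chars)), y.reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()

# Generate: feed characters back in one at a time and sample from the softmax.
idx, state, out = torch.tensor([[stoi[text[0]]]]), None, [text[0]]
for _ in range(200):
    logits, state = model(idx, state)
    probs = torch.softmax(logits[0, -1], dim=-1)
    idx = torch.multinomial(probs, 1).unsqueeze(0)
    out.append(chars[idx.item()])
print("".join(out))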

Vitali Burkov

Shared publicly
 
Very interesting.
 
A lot of people ask me how they can have a great career. This article shares my thoughts on that, as well as how I think about life, innovation, and some of the lessons I've learned working on research.  I'm usually a fairly private person, and there's a lot in this article that I haven't shared before. Take a look: http://www.huffingtonpost.com/2015/05/13/andrew-ng_n_7267682.html

Vitali Burkov

Shared publicly
 
Good mourning

Vitali Burkov

Shared publicly
 
 
Business Cat always has some kind of issue going on.
The Adventures of Business Cat: Boardroom. Posted May 8, 2015 by Fonder.

Vitali Burkov

Shared publicly
 
 
All of these images were computer generated!

For the last few weeks, Googlers have been obsessed with an internal visualization tool that Alexander Mordvintsev in our Zurich office created to help us visually understand some of the things happening inside our deep neural networks for computer vision.  The tool essentially starts with an image, runs the model forwards and backwards, and then makes adjustments to the starting image in weird and magnificent ways.  

In the same way that, when you are staring at clouds, you can convince yourself that some part of the cloud looks like a head, maybe with some ears, and your mind then starts to reinforce that opinion by seeing even more parts that fit the story ("wow, now I even see arms and a leg!"), the optimization process reinforces what it thinks it is seeing.  Since the model is very deep, we can tap into it at various levels and get all kinds of remarkable effects.

Alexander, +Christopher Olah, and Mike Tyka wrote up a very nice blog post describing how this works:

http://googleresearch.blogspot.com/2015/06/inceptionism-going-deeper-into-neural.html

There's also a bigger album of more of these pictures linked from the blog post:

https://goo.gl/photos/fFcivHZ2CDhqCkZdA

I just picked a few of my favorites here.
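As a toy illustration of that forward/backward loop (my own PyTorch sketch under assumed settings such as the layer choice, step size, and input file; not the internal tool or the blog post's code): pick a layer, run the image through the network, and use gradient ascent on the image itself to amplify whatever that layer already responds to.

import torch
from torchvision import models, transforms
from PIL import Image

# Toy gradient-ascent loop: adjust the starting image so that a chosen
# layer's activations grow, reinforcing whatever the network "sees" in it.
net = models.googlenet(weights="DEFAULT").eval()
acts = {}
net.inception4c.register_forward_hook(lambda m, i, o: acts.update(out=o))

img = transforms.ToTensor()(Image.open("clouds.jpg").resize((224, 224)))
img = img.unsqueeze(0).requires_grad_(True)

for _ in range(50):
    net(img)                              # forward pass; hook captures activations
    loss = acts["out"].norm()             # "amplify what the layer already sees"
    loss.backward()                       # backward pass: gradient w.r.t. the image
    with torch.no_grad():
        img += 0.01 * img.grad / (img.grad.abs().mean() + 1e-8)
        img.grad.zero_()

transforms.ToPILImage()(img.detach().squeeze(0).clamp(0, 1)).save("dream.jpg")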

Vitali Burkov

Shared publicly
 
The brave new world
People
Work
Employment
  • Sprut Technology
    programmer
Places
Map of the places this user has lived
Currently
Naberezhnye Chelny
Links
Contributor to
Story
Tagline
I am sprutman
Education
  • СШ 35
    2001
Basic Information
Gender
Male