Joe Philip's posts

Post has shared content

**2038: only 21 years away**

"Sometimes it seems that things have gone relatively quiet on the year-2038 front. But time keeps moving forward, and the point in early 2038 when 32-bit time_t values can no longer represent times correctly is now less than 21 years away. That may seem like a long time, but the relatively long life cycle of many embedded systems means that some systems deployed today will still be in service when that deadline hits. One of the developers leading the effort to address this problem is Arnd Bergmann; at Linaro Connect 2017 he gave an update on where that work stands."
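The deadline mentioned above is easy to compute: a signed 32-bit `time_t` counts seconds from the Unix epoch and runs out at 2^31 − 1 seconds. A quick sketch (in Python, just to illustrate the arithmetic — the affected systems themselves are C code using 32-bit `time_t`):

```python
from datetime import datetime, timedelta, timezone

# A signed 32-bit time_t counts seconds from the Unix epoch (1970-01-01 UTC)
# and tops out at 2**31 - 1 seconds; one second later the counter wraps.
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
last_valid = epoch + timedelta(seconds=2**31 - 1)
print(last_valid.isoformat())  # 2038-01-19T03:14:07+00:00
```

So the wraparound hits on 19 January 2038, just after 03:14:07 UTC — which is why embedded systems shipping today with 20-year service lives are already inside the danger window.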

https://lwn.net/SubscriberLink/717076/4c3593aa4cad8e66/

Post has shared content

This could be worth watching: Terence Tao will present the work of this year's Abel Prize winner when the winner is announced next Tuesday ...

Post has shared content

New blog post: Random-Walk #Bayesian Deep Networks: Dealing with Non-Stationary Data

http://twiecki.github.io/blog/2017/03/14/random-walk-deep-net/

http://imgur.com/a/UwbJl

Post has shared content

I don't like the way the concept of entropy (in the sense of information theory) is taught. I'll explain by analogy.

Area is the unique property of geometric figures that satisfies a bunch of "axioms" like "is additive", "is invariant under Euclidean motions", "if you scale something by a, its area scales by a^2", and so on. But nobody teaches area like this. The earliest lesson on area I can remember involved placing my hand on graph paper, drawing the shape of my hand, and then counting the number of squares in the shape. What's more, that lesson still corresponds well to how I use area today, and it has direct descendants in the form of the Riemann and Lebesgue integrals.

Entropy, by contrast, often gets taught from some set of axioms, including additivity (in some sense). These give some good intuitions, but I feel that at bottom something is missing. In particular, the formula for entropy never emerges of its own accord; everything seems a little contrived.

To me, the thing that brings the most clarity is known as the "Type Theorem". It's in many (but not all) books on information theory, and I provide a link below [1]. I think it's the appropriate analogue of counting squares for relative entropy. It starts with a natural question: if I draw n independent samples from some probability distribution on a finite set, and count how many times each outcome comes up, what is the probability distribution on the set of possible counts? I.e. what is the distribution of possible empirical distributions?

The Type Theorem answers this question. It says that to a good approximation the probability of any particular empirical distribution is a simple expression in the relative entropy between the empirical and proposed distributions (modulo some details you can read about). To me this makes everything clear. The form of the expression for entropy just pops out. And numerous applications of entropy outside of communications follow from this result. I think it should come before anything else in a course teaching entropy.

[1] https://blogs.princeton.edu/sas/2013/10/10/lecture-3-sanovs-theorem/
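The statement is easy to check numerically. A minimal sketch for a two-letter alphabet (the variable names here are mine, not from [1]): the exact probability of seeing a given empirical distribution ("type") is a multinomial coefficient times the product of probabilities, and the Type Theorem says this equals e^(−n·D(q‖p)) up to a factor polynomial in n.

```python
from math import comb, exp, log

def kl(q, p):
    """Relative entropy D(q || p) in nats."""
    return sum(qi * log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

# Flip a fair coin n times. What is the probability that the empirical
# distribution is exactly q = (k/n, 1 - k/n)?
n, k = 100, 30
p = (0.5, 0.5)            # true distribution
q = (k / n, 1 - k / n)    # empirical distribution (the "type")

exact = comb(n, k) * p[0]**k * p[1]**(n - k)  # exact type probability
estimate = exp(-n * kl(q, p))                 # Type Theorem: ~ e^{-n D(q||p)}

# The theorem's bounds: exact <= estimate, and they agree up to a
# polynomial factor (at most (n + 1)^2 for a two-letter alphabet).
assert exact <= estimate <= exact * (n + 1)**2
```

Notice that the relative-entropy formula appears here not as an axiom but as the exponent governing how fast atypical empirical distributions become improbable — which is exactly the "counting squares" flavor the post is after.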

Post has shared content

Europa, we are coming for you. :)

Our upcoming mission to investigate the habitability of Jupiter's icy moon Europa now has a formal name: Europa Clipper. Details: http://go.nasa.gov/2lLwwto

Post has shared content

And with perfect timing after the announcement (just a few days ago!) of seven planets orbiting TRAPPIST-1, two of which may be in the "Goldilocks zone" for life, +Tim Blais is out with his latest video: Whole New Worlds, for all those exoplanet hunters out there.

Post has shared content

**A fantastic visual introduction to probability and statistics.**

Post has shared content

Learn tensor networks in 10 minutes by following the link: bit.ly/2lliDxd

cover by +Lusa Zeglova

Post has shared content

Writing in +Nature News & Comment, microbiologist Herman Goossens argues that scientists should shout about the European Union's success.

**"Shout about the European Union's success"**
http://www.nature.com/news/shout-about-the-european-union-s-success-1.21479
