## Profile

Terence Tao
Works at UCLA
Attended Princeton University
Lives in Los Angeles
31,659 followers | 3,938,591 views

## Stream

### Terence Tao

Shared publicly

One of the big developments in complexity theory from late 2015 is unfortunately slightly less big than first thought, though still significant.

A retraction of the quasi-polynomial graph isomorphism claim of Babai (replaced by sub-exponential time).
In December 2015 I posted a manuscript titled Graph Isomorphism in Quasipolynomial Time (arXiv:1512.03547) (v1: 11 Dec 2015, v2: 19 Jan 2016). The title states the claimed result. A revised analysis of the (slightly modified) algorithm shows that it runs in subexponential but not ...

We will probably see more and more such public proof revision in the future.

### Terence Tao

Jean Bourgain wins 2017 Breakthrough Prize

Congratulations to Jean Bourgain (Princeton Institute for Advanced Study), who has won the 2017 Breakthrough Prize in mathematics!

https://www.ias.edu/press-releases/2016/bourgain-breakthrough
Jean Bourgain, IBM von Neumann Professor in the School of Mathematics, has received the 2017 Breakthrough Prize in Mathematics, for “multiple transformative contributions to analysis, combinatorics, partial differential equations, high-dimensional geometry, and number theory.” A member of the Institute Faculty since 1994, Bourgain was cited by the prize…

Interesting. Why did Sundar Pichai show up at the ceremony?

### Terence Tao

There is a classic paper of Shannon from 1950 (linked below) in which he estimates the entropy of the English language to be about 1 bit per character. Roughly speaking, this means that for medium-sized n (e.g. 50 to 100), the distribution of n-grams (strings of n consecutive letters in an English text, after stripping punctuation and spacing) is concentrated in a set of size about 2^n (this can be compared with the number of all possible n-grams using the Roman alphabet, which is 26^n). In particular, any two randomly selected n-grams from two different English texts should have a probability of about 2^{-n} of agreeing with each other purely by random chance.

As a consequence, given two English texts of character length N and M, the probability that they share a matching n-gram purely by random chance is approximately NM 2^{-n} (assuming that n is large enough that this product is significantly less than 1, but still small compared to N and M). One can view this quantity NM 2^{-n} as a crude measure of the "p-value" when testing the hypothesis that a match is occurring for a reason other than random chance.
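As a quick numerical illustration (a sketch under the 1-bit-per-character assumption, not a computation from the original post), the crude p-value NM 2^{-n} and its exponential decay in n can be coded directly:

```python
def match_p_value(N, M, n):
    """Crude p-value for two texts of character lengths N and M sharing
    a matching n-gram purely by chance, assuming English carries about
    1 bit of entropy per character: p ~ N * M * 2^(-n).
    Only meaningful when the result is well below 1."""
    return N * M * 2.0 ** (-n)

# Illustrative figures from the discussion: N ~ 2^12.5 characters for the
# tested speech, M ~ 2^43 characters for a web-scale reference corpus.
N, M = 2 ** 12.5, 2 ** 43

# Each additional character of matching length halves the p-value, so a
# match 10 characters longer is about 2^10 ~ 1000 times less likely.
ratio = match_p_value(N, M, 70) / match_p_value(N, M, 80)
print(ratio)  # 1024.0
```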

It is tempting to apply this formula as a test for plagiarism when a matching n-gram is found between two texts (as was recently the case involving the convention speeches of Melania Trump and Michelle Obama). One can take N to be the number of characters of the text to be tested for plagiarism (in the case of Trump's speech, this is about 2^{12.5}). It would however be unfair to take M to be the length of the text in which a match was found (in this case, Obama's speech, which is about 2^{13} characters), because this is not the only reference text that would be available for plagiarism (this is a variant of the "Texas sharpshooter fallacy"). For maximum benefit of the doubt, one could take M to be something like the total number of characters of English text on pages indexed by a major search engine; this is a bit difficult to estimate, but it seems that there are about 10^9 to 10^10 English web pages indexed by (say) Google, which on average contain about 10^3 to 10^4 characters of text, so one could very roughly take M to be about 10^{13}, or about 2^{43}. (This is probably an overestimate due to redundancy of text on the web, as well as the unsuitability of much of this corpus for use in a political speech, but I have no easy way to estimate the level of such redundancy or unsuitability.)

Using this corpus, we can then estimate the longest n-gram match one should expect between a speech such as Trump's or Obama's and the entirety of the indexed internet to be about 65-70 characters, with the p-value of a match decreasing exponentially, by roughly a factor of two for every additional character beyond this range.

In the case of Trump's speech, it turns out that the longest n-gram match between the Trump and Obama speeches is 81 characters long ("values that you work hard for what you want in life your word is your bond and you do what you say", minus the spaces). This suggests a p-value of at most 0.001. This is an extremely rough estimate, given the large margins of error in the quantities used above, though it is probably on the conservative side, as it uses a rather large corpus for M and ignores the other matching or nearly matching n-grams between the two texts.
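As a toy sketch of the matching step (using Python's difflib, not whatever tool was actually used for the speeches), one can strip spacing and punctuation and then extract the longest common substring of two texts:

```python
import re
from difflib import SequenceMatcher

def longest_shared_ngram(text_a, text_b):
    """Longest common character string between two texts after stripping
    punctuation and spacing, as in the n-gram comparison described above."""
    a = re.sub(r"[^a-z]", "", text_a.lower())
    b = re.sub(r"[^a-z]", "", text_b.lower())
    # autojunk must be off, or frequent characters get ignored in long strings
    sm = SequenceMatcher(None, a, b, autojunk=False)
    m = sm.find_longest_match(0, len(a), 0, len(b))
    return a[m.a : m.a + m.size]

# Toy example (not the actual speech texts):
match = longest_shared_ngram("Your word is your bond!",
                             "...your word is your bond, and...")
print(len(match))  # 18 characters: "yourwordisyourbond"
```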

There have been some attempts to counter the accusations of plagiarism by pointing out that the speech of Obama (or of other speakers) also contains matches with prior texts; however, these matches are much shorter in length (e.g. the phrase "the world as it should be" appears both in the speech of Obama and in a speech of Saul Alinsky), and given the exponentially decaying nature of the p-value, such shorter matches are completely explainable through random chance. A matching n-gram that is even just 10 characters shorter than the one in Trump's speech, for instance, would be about 1000 times more likely to occur by random chance; one that is 20 characters shorter would be about one million times more likely; and so forth. This is certainly a situation in which the level of significance does not behave linearly with the length of the matching text!

It is beautiful. I like this.

### Terence Tao

An impressively huge lockdown here at UCLA.
Police are investigating a possible shooting at UCLA’s Engineering IV building Wednesday morning. The campus is on lockdown and…

Citizens are not allowed to own guns anywhere in the country; only the police can. So some local governments are able to pull down people's houses without fear. For example, last month a group of police in Hainan province pulled down people's houses by force; women and children were beaten, and they cried in vain. But all in all, I think guns should be banned.

### Terence Tao

It's always been a bit of a mystery why the polynomial method, which has had many impressive successes in the incidence geometry of finite field vector spaces, has not been able to make progress on the capset problem (bounding the size of subsets of Z_3^n that are free of three-term arithmetic progressions). In this paper, Croot, Lev, and Pach manage to use the polynomial method to get a surprisingly strong bound for the analogous problem in Z_4^n (far surpassing the Fourier-analytic techniques which have long held the title of best bounds for these Roth-theorem-type results). Currently the method is restricted to the 4-torsion case, but it gives hope that some other deployment of the polynomial method could be effective for other groups, and particularly for the capset problem. #spnetwork arXiv:1605.01506
Selected Papers Network
Progression-free sets in Z_4^n are exponentially small. Ernie Croot, Vsevolod Lev, Peter Pach. We show that for integer $n>0$, any subset $A \subset Z_4^n$ free of three-term arithmetic progressions has size $|A|<2(\sqrt n+1) 4^{c n}$, with an absolute constant $c \approx 0.926$.
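To get a feel for how strong the bound is, one can compare it numerically with the trivial bound |Z_4^n| = 4^n (a rough sketch; the constant c ≈ 0.926 is taken from the abstract above):

```python
import math

def clp_bound(n, c=0.926):
    """Croot-Lev-Pach upper bound |A| < 2(sqrt(n)+1) 4^(cn) for a
    progression-free set A in Z_4^n (c ~ 0.926 from the abstract)."""
    return 2 * (math.sqrt(n) + 1) * 4 ** (c * n)

# The bound is an exponentially small fraction of the whole group 4^n:
for n in (50, 100, 200):
    print(n, clp_bound(n) / 4 ** n)
```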

So amazing!!!

### Terence Tao

In case one likes a little bit of suspense, the Abel prize for this year will be announced about eight hours from now (Tue 11am GMT) on the linked website.  (I am not involved with the prize selection this year and have no idea who the recipient will be.)

### Terence Tao

This would have been a useful video to share with my (recently concluded) graduate complex analysis class.

I am a UG student, and this semester I have a Complex Analysis paper. At the very beginning of the course, this video provided me the motivation I needed for the subject, so thank you, and please post this sort of article more frequently...

### Terence Tao

Open to students between 13 and 18 in most countries; submission deadline is October 10. First prize is a \$250,000 college scholarship, together with an award to a teacher who inspired the student, and funding for a science laboratory at the student's school.

Mathematicians are morons. They have not yet figured out that the world does not agree on numbers. A billion is not the same everywhere!!! It depends on the language. How stupid is that?

### Terence Tao

When I became naturalised as a US citizen in 2009, I received at the ceremony a little bag containing (among other things) a small US flag and a pocket copy of the US constitution. A few years ago, while doing some housecleaning, I decided to discard the copy of the constitution, reasoning that the text was easily available online whenever I needed to read it. (Also, I had previously studied the constitution in order to pass the citizenship exam.)

... I think I now regret that decision.

Dear Professor, Trump won. If you come and live in China, you will be very welcome among the Chinese people :-)

### Terence Tao

Correlation does not imply causation; nevertheless, this is quite a striking correlation between trust in experts and Brexit voting intentions (from June 16). (Note: the actual data http://d25d2506sfb94s.cloudfront.net/cumulus_uploads/document/x4iynd1mn7/TodayResults_160614_EUReferendum_W.pdf has a more refined breakdown depending on what type of "expert" is being considered, but the correlation persists across types.)

All the best

### Terence Tao

Maths moves at a faster pace these days... only eight days after the breakthrough of Croot, Lev, and Pach obtaining for the first time an exponentially strong upper bound on the cap set problem in (Z/4Z)^n, Jordan Ellenberg has adapted the argument to the original setting of (Z/3Z)^n; we now know that the largest capset in this space has size between 2.2174^n and 2.756^n asymptotically. (I blogged about the cap set problem in one of my first blog posts way back in 2007: https://terrytao.wordpress.com/2007/02/23/open-question-best-bounds-for-cap-sets/ .) The argument looks like it will also extend to finite fields other than Z/3Z.

Like Dvir's argument solving the finite field Kakeya problem, it ends up being a very short (three-page) application of the polynomial method. Basically, the point is that if A is a capset (avoiding 3-term progressions), and P(x) is some polynomial that vanishes outside of -A but is nonzero for many values of -A, then the polynomial P(x+y) vanishes on all of A x A except on the diagonal, where it has many nonzeroes. But this forces the rank of the matrix P(x+y) for x, y in A to be large, which in turn forces the degree of P to be large. The usual linear algebra and dimension counting of the polynomial method then finishes off the proof.

The original problem of improving the bounds in Roth's theorem remains open (much as the Euclidean Kakeya problem does), but this result at least gives hope that the current logarithmic-type upper bounds for Roth's theorem (based on Fourier methods) can be improved, given that the Fourier methods have finally been dethroned from their long-held title as claimants to the strongest quantitative bounds in the finite field setting.
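For very small n the capset problem can be checked directly. Here is a brute-force sketch (illustrative only, and hopeless beyond tiny n) confirming the known maximum capset sizes 2 and 4 for Z_3^1 and Z_3^2:

```python
from itertools import combinations, product

def is_capset(points):
    """True if no three distinct points x, y, z in Z_3^n satisfy
    x + y + z = 0; in characteristic 3 this is exactly freeness from
    three-term arithmetic progressions."""
    s = set(points)
    for x, y in combinations(points, 2):
        z = tuple((-a - b) % 3 for a, b in zip(x, y))
        if z in s and z != x and z != y:
            return False
    return True

def max_capset_size(n):
    """Size of the largest capset in Z_3^n, by exhaustive search
    over all subsets (feasible only for n = 1 or 2)."""
    pts = list(product(range(3), repeat=n))
    for r in range(len(pts), 0, -1):
        if any(is_capset(c) for c in combinations(pts, r)):
            return r
    return 0

print(max_capset_size(1), max_capset_size(2))  # 2 4
```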

### Terence Tao

Among the many nice things I liked about this result: (a) it uses Fourier-analytic techniques to obtain a sharp constant, and (b) the optimal Fourier multiplier was constructed using modular forms, which I would not have suspected to be relevant for a sphere packing problem...

Famous math problem solved!

Ten days ago, Maryna Viasovska showed how to pack spheres in 8 dimensions as tightly as possible. In this arrangement the spheres occupy about 25.367% of the space. That looks like a strange number - but it's actually a wonderful number, as shown here.
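The 25.367% figure has a closed form: the density of the E8 lattice packing is π^4/384 (a standard formula, not from the new paper), which is easy to check numerically:

```python
import math

# Density of the E8 lattice sphere packing: pi^4 / 384, i.e. the spheres
# fill about 25.367% of 8-dimensional space.
density = math.pi ** 4 / 384
print(f"{100 * density:.3f}%")  # 25.367%
```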

People had guessed the answer to this problem for a long time. If you try to get as many equal-sized spheres to touch a sphere in 8 dimensions, there's exactly one way to do it - unlike in 3 dimensions, where there's a lot of wiggle room! And if you keep doing this, on and on, you're forced into a unique arrangement, called the E8 lattice. So this pattern is an obvious candidate for the densest sphere packing in 8 dimensions. But none of this proves it's the best!

In 2001, Henry Cohn and Noam Elkies showed that no sphere packing in 8 dimensions could be more than 1.000001 times as dense as E8. Close... but no cigar.

Now Maryna Viasovska has used the same technique, but pushed it further. Now we know: nothing can beat E8 in 8 dimensions!

Viasovska is an expert on the math of "modular forms", and that's what she used to crack this problem. But when she's not working on modular forms, she writes papers on physics! Serious stuff, like "Symmetry and disorder of the vitreous vortex lattice in an overdoped BaFe_{2-x}Co_x As_2 superconductor."

After coming up with her new ideas, Viasovska teamed up with other experts including Henry Cohn and proved that another lattice, the Leech lattice, gives the densest sphere packing in 24 dimensions.

Different dimensions have very different personalities. Dimensions 8 and 24 are special. You may have heard that string theory works best in 10 and 26 dimensions - two more than 8 and 24. That's not a coincidence.

The densest packings of spheres are known only in dimensions 0, 1, 2, 3, and now 8 and 24. Good candidates are known in many other low dimensions: the problem is proving things - and in particular, ruling out the huge unruly mob of non-lattice packings.

For example, in 3 dimensions there are uncountably many non-periodic packings of spheres that are just as dense as the densest lattice packing!

In fact, the sphere packing problem is harder in 3 dimensions than 8. It was only solved earlier because it was more famous, and one man - Thomas Hales - had the nearly insane persistence required to crack it.

His original proof was 250 pages long, together with 3 gigabytes of computer programs, data and results. He subsequently verified it using a computerized proof assistant, in a project that required 12 years and many people.

By contrast, Viasovska's proof is extremely elegant. It boils down to finding a function whose Fourier transform has a simple and surprising property! For details on that, try my blog article:

https://golem.ph.utexas.edu/category/2016/03/e8_is_the_best.html

#geometry

Can you imagine 8 dimensions? What is it like to live in 8 dimensions? 3 - space, 4 - spacetime, 5 - alternate universes, but 6, 7, 8?