Profile

Terence Tao
Works at UCLA
Attended Princeton University
Lives in Los Angeles
31,600 followers | 3,858,210 views

Stream

Terence Tao

Shared publicly  - 
 
When I became naturalised as a US citizen in 2009, I received at the ceremony a little bag containing (among other things) a small US flag and a pocket copy of the US constitution. A few years ago, while doing some housecleaning, I decided to discard the copy of the constitution, reasoning that the text was easily available online whenever I needed to read it. (Also, I had previously studied the constitution in order to pass the citizenship exam.)

... I think I now regret that decision.
66
3
17 comments
 
I don't think it's too late; you can get another one if you need it.

Terence Tao

Shared publicly  - 
 
Correlation does not imply causation; nevertheless, this is quite a striking correlation between trust in experts and Brexit voting intentions (from June 16). (Note: the actual data http://d25d2506sfb94s.cloudfront.net/cumulus_uploads/document/x4iynd1mn7/TodayResults_160614_EUReferendum_W.pdf has a more refined breakdown depending on what type of "expert" is being considered, but the correlation persists across types.)
81
26
38 comments
 
All the best



Terence Tao

Shared publicly  - 
 
Maths moves at a faster pace these days... only eight days after the breakthrough of Croot, Lev, and Pach, who obtained for the first time an exponentially strong upper bound on the cap set problem in (Z/4Z)^n, Jordan Ellenberg has adapted the argument to the original setting of (Z/3Z)^n; we now know that the largest capset in this space has size between 2.2174^n and 2.756^n asymptotically. (I blogged about the cap set problem in one of my first blog posts way back in 2007: https://terrytao.wordpress.com/2007/02/23/open-question-best-bounds-for-cap-sets/ .) The argument looks like it will also extend to finite fields other than Z/3Z.

Like Dvir's argument solving the finite field Kakeya problem, it ends up being a very short (three-page) application of the polynomial method. Basically, the point is that if A is a capset (avoiding 3-term progressions), and P(x) is some polynomial that vanishes outside of -A but is nonzero at many points of -A, then the polynomial P(x+y) vanishes on all of A x A except on the diagonal, where it has many nonzero entries. But this forces the rank of the matrix P(x+y) for x,y in A to be large, which in turn forces the degree of P to be large. The usual linear algebra and dimension counting of the polynomial method then finishes off the proof.
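
In schematic form, the key rank computation is the following (my notation, suppressing the precise parameters; see Ellenberg's note for the actual argument). If \deg P \le d, then every monomial of P(x+y) has degree at most d/2 in x or in y, giving a decomposition

  P(x+y) = \sum_{\deg m \le d/2} m(x)\, Q_m(y) + \sum_{\deg m' \le d/2} R_{m'}(x)\, m'(y),

so the matrix (P(x+y))_{x,y \in A} has rank at most 2 M_{d/2}, where M_{d/2} is the number of monomials of degree at most d/2. On the other hand, this matrix is diagonal with many nonzero entries P(2x), so its rank is the number of x \in A with P(2x) \neq 0; comparing the two bounds is what ultimately forces |A| to be exponentially small.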

The original problem of improving the bounds in Roth's theorem remains open (much as the Euclidean Kakeya problem does), but this result at least gives hope that the current logarithmic-type upper bounds for Roth's theorem (based on Fourier methods) can be improved, given that the Fourier methods have finally been dethroned from their long-held title as claimants to the strongest quantitative bounds in the finite field setting.
77
19

Terence Tao

Shared publicly  - 
 
Among the many nice things I liked about this result: (a) it uses Fourier-analytic techniques to obtain a sharp constant, and (b) the optimal Fourier multiplier was constructed using modular forms, which I would not have suspected to be relevant for a sphere packing problem...
 
Famous math problem solved!

Ten days ago, Maryna Viazovska showed how to pack spheres in 8 dimensions as tightly as possible. In this arrangement the spheres occupy about 25.367% of the space. That looks like a strange number - but it's actually a wonderful number, as shown here.
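
That wonderful number is exactly pi^4/384: the E8 lattice has covolume 1 and packs one sphere of radius 1/sqrt(2) per cell. A quick sanity check in Python (standard library only):

import math

# Volume of the unit ball in 8 dimensions: V_8 = pi^4 / 4! = pi^4 / 24.
ball_volume = math.pi ** 4 / 24
# E8's shortest nonzero vectors have length sqrt(2), so the packed
# spheres have radius 1/sqrt(2); the lattice itself has covolume 1.
radius = 1 / math.sqrt(2)
density = ball_volume * radius ** 8   # one sphere per unit of volume
print(f"{density:.5%}")               # -> 25.36695%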

People had guessed the answer to this problem for a long time. If you try to get as many equal-sized spheres as possible to touch a central sphere in 8 dimensions, there's exactly one way to do it - unlike in 3 dimensions, where there's a lot of wiggle room! And if you keep doing this, on and on, you're forced into a unique arrangement, called the E8 lattice. So this pattern is an obvious candidate for the densest sphere packing in 8 dimensions. But none of this proves it's the best!

In 2001, Henry Cohn and Noam Elkies showed that no sphere packing in 8 dimensions could be more than 1.000001 times as dense as E8. Close... but no cigar.

Now Maryna Viazovska has used the same technique, but pushed it further. Now we know: nothing can beat E8 in 8 dimensions!

Viazovska is an expert on the math of "modular forms", and that's what she used to crack this problem. But when she's not working on modular forms, she writes papers on physics! Serious stuff, like "Symmetry and disorder of the vitreous vortex lattice in an overdoped BaFe_{2-x}Co_xAs_2 superconductor."

After coming up with her new ideas, Viazovska teamed up with other experts including Henry Cohn and proved that another lattice, the Leech lattice, gives the densest sphere packing in 24 dimensions.

Different dimensions have very different personalities. Dimensions 8 and 24 are special. You may have heard that string theory works best in 10 and 26 dimensions - two more than 8 and 24. That's not a coincidence.

The densest sphere packings are known only in dimensions 0, 1, 2, 3, and now 8 and 24. Good candidates are known in many other low dimensions: the problem is proving things - and in particular, ruling out the huge unruly mob of non-lattice packings.

For example, in 3 dimensions there are uncountably many non-periodic packings of spheres that are just as dense as the densest lattice packing!

In fact, the sphere packing problem is harder in 3 dimensions than in 8. It was only solved earlier because it was more famous, and one man - Thomas Hales - had the nearly insane persistence required to crack it.

His original proof was 250 pages long, together with 3 gigabytes of computer programs, data and results. He subsequently verified it using a computerized proof assistant, in a project that required 12 years and many people.

By contrast, Viazovska's proof is extremely elegant. It boils down to finding a function whose Fourier transform has a simple and surprising property! For details on that, try my blog article:

https://golem.ph.utexas.edu/category/2016/03/e8_is_the_best.html

#geometry


43 comments on original post
149
22
10 comments
 
Can you imagine 8 dimensions? What is it like to live in 8 dimensions? 3 - space, 4 - spacetime, 5 - alternate universes, but 6, 7, 8 ?

Terence Tao

Shared publicly  - 
 
The upcoming conference on May 21-24 at the IAS in honour of Jean Bourgain's many contributions to mathematics is still open for registration, and has some funds to support graduate students and postdocs attending (with applications from under-represented minorities particularly welcome).  [I am one of the organisers of this event.]
36
5

Terence Tao

Shared publicly  - 
 
The pedagogy here is unconventional, but surprisingly effective.
147
42
5 comments
 
this was amazing

Terence Tao

Shared publicly  - 
 
There is a classic paper of Shannon from 1950 (linked below) in which he estimates the entropy of the English language to be about 1 bit per character. Roughly speaking, this means that for medium-sized n (e.g. 50 to 100), the distribution of n-grams (strings of n consecutive letters in an English text, after stripping punctuation and spacing) is concentrated in a set of size about 2^n (this can be compared with the number of all possible n-grams using the Roman alphabet, which is 26^n). In particular, any two randomly selected n-grams from two different English texts should have a probability of about 2^{-n} of agreeing with each other purely by random chance.

As a consequence, given two English texts of character length N and M, the probability that they share a matching n-gram purely from random chance is approximately N M 2^{-n} (assuming that n is large enough that this product is significantly less than 1, but still small compared to N and M). One can view this quantity NM 2^{-n} as a crude measure of the "p-value" when testing for the hypothesis that a match is occurring for a reason other than random chance.

It is tempting to apply this formula as a test for plagiarism, when a matching n-gram is found between two texts (as was recently the case involving the convention speeches of Melania Trump and Michelle Obama). One can take N to be the number of characters of the text to be tested for plagiarism (in the case of Trump's speech, this is about 2^{12.5}). It would however be unfair to take M to be the length of the text in which a match was found (in this case, Obama's speech, which is about 2^13 characters), because this is not the only reference text that would have been available for plagiarism (this is a variant of the "Texas sharpshooter fallacy"). For maximum benefit of the doubt, one could take M to be something like the total number of characters of English text on pages indexed by a major search engine; this is a bit difficult to estimate, but it seems that there are about 10^9 to 10^10 English web pages indexed by (say) Google, which on average contain about 10^3 to 10^4 characters of text, so one could very roughly take M to be about 10^13, or about 2^43. (This is probably an overestimate, due both to redundancy of text on the web and to the unsuitability of much of this corpus for use in a political speech, but I have no easy way to estimate the level of such redundancy or unsuitability.)

Using this corpus, we can then estimate the maximum n-gram match one should have between a speech such as Trump's or Obama's and the entirety of the indexed internet to be about 65-70, with the p-value of a match decreasing exponentially by roughly a factor of two for every additional character beyond this range.

In the case of Trump's speech, it turns out that the longest n-gram match between Trump and Obama is 81 characters long ("values that you work hard for what you want in life your word is your bond and you do what you say", minus the spaces). This suggests a p-value of at most 0.001. This is an extremely rough estimate, given the large margins of error in the quantities used above, though it is probably on the conservative side: it uses a rather large corpus for M, and it ignores the other matching or nearly matching n-grams between the two texts.
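
To make the arithmetic concrete, here is a back-of-envelope sketch using only the estimates above (every constant is approximate):

# p-value estimate N * M * 2^(-n) for an n-character match, using the
# rough figures from the post: N ~ 2^12.5 characters in the speech
# being tested, M ~ 2^43 characters of indexed English text.
N = 2 ** 12.5
M = 2 ** 43
for n in (81, 71, 61):
    p = N * M * 2 ** (-n)
    print(f"n = {n}: p ~ {p:.1e}")
# n = 81: p ~ 2.1e-08
# n = 71: p ~ 2.2e-05
# n = 61: p ~ 2.2e-02
# Each 10 fewer matching characters multiplies p by 2^10, i.e. about 1000.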

There have been some attempts to counter the accusations of plagiarism by pointing out that the speeches of Obama (or of other speakers) also contain matches with prior texts; however, these matches are much shorter (e.g. the phrase "the world as it should be" appears both in the speech of Obama and in a speech of Saul Alinsky), and given the exponentially decaying nature of the p-value, such shorter matches are completely explainable by random chance. A matching n-gram that is even just 10 characters shorter than the one in Trump's speech, for instance, would be about 1000 times more likely to occur by random chance; one that is 20 characters shorter would be about one million times more likely; and so forth. This is certainly a situation in which the level of significance does not behave linearly with the length of the matching text!
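
For anyone who wants to hunt for such matches themselves, here is a minimal sketch using the Python standard library (the speech transcripts are whatever texts you supply):

from difflib import SequenceMatcher

def longest_match(text_a, text_b):
    # strip everything but letters, per the n-gram convention above
    a = "".join(ch for ch in text_a.lower() if ch.isalpha())
    b = "".join(ch for ch in text_b.lower() if ch.isalpha())
    # autojunk=False disables a heuristic that skews results on long texts
    sm = SequenceMatcher(None, a, b, autojunk=False)
    m = sm.find_longest_match(0, len(a), 0, len(b))
    return a[m.a:m.a + m.size]

Run on the two convention speeches, this should recover the 81-character run quoted above (with spaces and punctuation removed).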
101
26
17 comments
 
It is beauteous. I like this.

Terence Tao

Shared publicly  - 
 
An impressively huge lockdown here at UCLA.
Police are investigating a possible shooting at UCLA’s Engineering IV building Wednesday morning. The campus is on lockdown and…
32
4
11 comments
 
+Adelaide Dupont Citizens are not allowed to own guns anywhere in the country; only the police can. So some local governments are able to pull down people's houses without fear. For example, last month a group of police in Hainan province pulled down people's houses by force; women and children were beaten, and they cried in vain. But all in all, I think guns should be banned.

Terence Tao

Shared publicly  - 
 
It's always been a bit of a mystery why the polynomial method, which has had many impressive successes in the incidence geometry of finite field vector spaces, has not been able to make progress on the capset problem (bounding the size of subsets of Z_3^n that are free of three-term arithmetic progressions). In this paper of Croot, Lev, and Pach, they manage to use the polynomial method to get a surprisingly strong bound for the analogous problem in Z_4^n (far surpassing the Fourier-analytic techniques which have long held the title to best bounds for these Roth theorem type results). Currently the method is restricted to the 4-torsion case, but it gives hope that some other deployment of the polynomial method could be effective for other groups, and particularly for the capset problem. #spnetwork arXiv:1605.01506
Selected Papers Network
Progression-free sets in Z_4^n are exponentially small. Ernie Croot, Vsevolod Lev, Peter Pach. We show that for integer $n>0$, any subset $A \subset Z_4^n$ free of three-term arithmetic progressions has size $|A|<2(\sqrt n+1) 4^{c n}$, with an absolute constant $c \approx 0.926$.
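For a sense of scale, here is a quick numerical aside (c = 0.926 is the constant from the abstract just quoted):

import math

# Compare the Croot-Lev-Pach bound 2*(sqrt(n)+1)*4^(c*n) with the
# trivial bound 4^n: the density of a progression-free set in Z_4^n
# decays exponentially in n.
c = 0.926
for n in (50, 100, 200):
    bound = 2 * (math.sqrt(n) + 1) * 4 ** (c * n)
    print(f"n = {n}: density <= {bound / 4 ** n:.1e}")
# n = 50: density <= 9.6e-02
# n = 100: density <= 7.7e-04
# n = 200: density <= 3.7e-08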
42
11
 
So amazing!!!

Terence Tao

Shared publicly  - 
 
In case one likes a little bit of suspense, the Abel prize for this year will be announced about eight hours from now (Tue 11am GMT) on the linked website.  (I am not involved with the prize selection this year and have no idea who the recipient will be.)
45
7
10 comments
 
Congratulations to ANDREW WILES! www.facebook.com/ashok.mishra.5201254

Terence Tao

Shared publicly  - 
 
 
The new journal Discrete Analysis has just been launched.

Points to note.

1. It is an arXiv overlay journal. That means that to submit a paper you just post it to the arXiv and tell us that that is what you have done. If we accept it, we suggest revisions; you post those, and that's it. That enables us to keep our costs about two orders of magnitude lower than those of a traditional journal. Papers are free to read (obviously) and we make no charge to authors.

2. Contrary to what one might expect, the website of the journal adds considerable value to the articles, by presenting them properly. Thanks to the work of Scholastica, the platform we chose for the journal, the site looks beautiful and is laid out extremely thoughtfully (this will become clearer when we have more articles). We have also provided "Editorial introductions" for each article, which aim to place it in context and help you judge whether you might be interested in reading it. If you open one of these, it does not take you to a different page, but opens up in a box on the existing page. (In general, we have tried to minimize the amount of loading you have to do.)

I have written more about it in a blog post:

   https://gowers.wordpress.com/2016/03/01/discrete-analysis-launched/
8 comments on original post
76
11
5 comments
 
So discrete I wasn't even aware of it.

Terence Tao

Shared publicly  - 
 
 
Frankl's union-closed conjecture is the following easy-sounding statement. Let A_1,...,A_n be n subsets of a set X, and suppose that for each i and j the union of A_i and A_j is one of the subsets (so the collection of sets is closed under taking unions). Then there is an element x of X that belongs to at least n/2 of the sets.

Seems simple? Then why not help me prove it? But I should mention, by way of a mild warning, that it is not even known whether there must be an element of X that belongs to at least n/1,000,000 of the sets (or indeed cn for any fixed constant c>0). But in a way that makes it easier to make progress -- even a statement that falls quite some way short of the conjecture would be an advance. 
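
As a warm-up, one can at least confirm the conjecture by brute force over a tiny ground set. A minimal sketch in Python (my own code, not from the post; the trivial family consisting only of the empty set is excluded, as is conventional):

from itertools import combinations

def union_closed(family):
    # every pairwise union must land back in the family
    return all((a | b) in family for a in family for b in family)

def frankl_holds(family):
    # some element must lie in at least half of the sets
    n = len(family)
    elements = frozenset().union(*family)
    return any(2 * sum(1 for s in family if x in s) >= n for x in elements)

ground = (0, 1, 2)
subsets = [frozenset(c) for r in range(len(ground) + 1)
           for c in combinations(ground, r)]

# enumerate all 2^8 - 1 nonempty families of subsets of {0, 1, 2}
for bits in range(1, 2 ** len(subsets)):
    family = {subsets[i] for i in range(len(subsets)) if bits >> i & 1}
    if family == {frozenset()}:  # skip the trivial family {emptyset}
        continue
    if union_closed(family) and not frankl_holds(family):
        print("counterexample:", family)
        break
else:
    print("verified for every union-closed family over the ground set")

Of course, the conjecture is already known for such small ground sets; the point of the exercise is only to make the statement concrete.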
6 comments on original post
36
4
 
You have to prove it ...
It is a good mini-project for you, I think ...
People
Have him in circles
31,600 people
Education
  • Princeton University
    Mathematics, 1992 - 1996
  • Flinders University
    Mathematics, 1989 - 1992
Basic Information
Gender
Male
Work
Occupation
Mathematician
Employment
  • UCLA
    Mathematician, present
Places
Currently
Los Angeles
Previously
Adelaide, Australia
Links
Other profiles