### David Wakeham

30 followers | 104,393 views

Some disturbing industrial choral from the Silent Hilliard Ensemble.

Simon Lilburn

Miseryre

Very exciting stuff!

It's taken me longer than planned to explain the 'firewall paradox' for black holes. The reason is: I don't understand it. But that could be a good thing! You're not really supposed to 'understand' a paradox. It's supposed to not make sense... until you spot the incorrect assumptions that created the problem.

So I won't explain the firewall paradox; I'll just quote Sean Carroll's quick explanation:

**Here’s my attempt to squeeze the firewall argument down to its essence, for people who know a little quantum mechanics. If information escapes from a black hole, the radiation emitted at late times must share quantum entanglement with radiation that escaped at early times, in order to describe a pure quantum state (from which the black hole presumably formed). At the same time, to an observer near the event horizon, the local conditions are supposed to look almost like empty space — the quantum vacuum. But within that vacuum are virtual particles, some of which will eventually escape in the form of radiation and some of which will eventually fall into the black hole. In order for the state near the horizon to look like the vacuum, that outgoing radiation and the ingoing radiation must also be entangled. Therefore, it appears that the outgoing radiation is both entangled with the ingoing radiation, and with the radiation that escaped at earlier times. But that’s impossible; quantum mechanics won’t let degrees of freedom be separately (maximally) entangled with two different other sets of degrees of freedom. Entanglement is monogamous. A simple — but unpalatable — way out is to suggest that the state near the horizon is not a quiet state of maximal entanglement, but a noisy thermal state of high-energy radiation — a firewall.**

By this he means a lot of high-energy radiation *as seen by an infalling observer*. We already believe an observer hovering a little bit over the black hole horizon, thrusters blasting to keep from falling in, will see a lot of Hawking radiation - radiation that looks dim and faint to observers further away. But the usual story is that an observer *freely falling through the horizon* sees no Hawking radiation: just a vacuum. The firewall paradox claims that can't be true. It's a 'paradox' if you believe, as general relativity claims, that a freely falling observer should not see anything special as they pass through an event horizon.
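Monogamy of entanglement can be seen in a toy calculation. In the three-qubit GHZ state, qubit A is maximally entangled with B and C taken jointly, yet the pairwise A-B marginal contains no entanglement at all. A minimal numpy sketch (my own illustration, not part of Carroll's argument), using the Peres-Horodecki partial-transpose test, which is exact for two qubits:

```python
import numpy as np

# Bell pair |Phi+> = (|00> + |11>)/sqrt(2) and GHZ = (|000> + |111>)/sqrt(2)
bell = np.zeros(4); bell[[0, 3]] = 1 / np.sqrt(2)
ghz = np.zeros(8); ghz[[0, 7]] = 1 / np.sqrt(2)

rho_bell = np.outer(bell, bell)
rho_ghz = np.outer(ghz, ghz)

def trace_out_last(rho):
    """Partial trace over the last qubit of a 3-qubit density matrix."""
    r = rho.reshape(2, 2, 2, 2, 2, 2)
    return np.einsum('abcdec->abde', r).reshape(4, 4)

def min_pt_eig(rho):
    """Minimum eigenvalue of the partial transpose of a 2-qubit state.
    For two qubits, a negative value is equivalent to entanglement."""
    pt = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)
    return np.linalg.eigvalsh(pt).min()

rho_ab = trace_out_last(rho_ghz)   # what A and B share inside the GHZ state
print(min_pt_eig(rho_bell))  # negative: the Bell pair is entangled
print(min_pt_eig(rho_ab))    # >= 0: the A-B marginal of GHZ is unentangled
```

The point mirrors the firewall tension: a degree of freedom can be maximally entangled with one partner or shared classically among several, but not both at once.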

Okay. Now I need to think about this some more! In the meantime, if you've never thought about this stuff before, I suggest reading Sean Carroll's post, which I just quoted:

http://www.preposterousuniverse.com/blog/2012/12/21/firewalls/

and then Joe Polchinski's post on the same blog:

http://www.preposterousuniverse.com/blog/2012/09/27/guest-post-joe-polchinski-on-black-holes-complementarity-and-firewalls/

#quantum_gravity #blackholes


A great review article about simulating the ignition mechanism for Type Ia supernovae, which is still poorly understood. They think a process called *gravitationally confined detonation* (GCD) may be responsible. Basically, an ash-filled bubble of carbon forms in the interior of a white dwarf. Buoyancy forces the bubble towards the surface, and turbulence sculpts the ash into a weird-looking spear which punches through the outer layer. Gravity traps some of the resulting ash plume. This trapped ash spreads radially around the star then collides at the antipodal point, causing a temperature spike big enough to spark the supernova.

There seem to be some interesting lessons here about interaction at different scales and generalising numerically stiff problems. Along with cool things like molecular dynamics and complex network-type stuff, it makes me wonder why "big computation" never got the same kind of wide-eyed pop-sci press as "big data". Also: the simulations produced some amazing images!


"I would love to have media sources cite the research articles they cover, and have those articles available to scrutinise. This would of course reduce the impact of headlines reading 'Y cures Z' if, with just a click, you could see that, 'Y mildly correlates with Z (r=0.35), in mice (n=6), under lab-conditions'."

From Jonathan Carroll's article on open-access journals, linked below.


Thanks, that was pretty interesting too (as were a lot of the other posts on that blog!). I usually read Michael Eisen's blog on this - he's one of the co-founders of PLoS. He's absolutely clueless on ecology, but very interesting on how the peer review system works.

http://www.michaeleisen.org/blog/ (e.g. http://www.michaeleisen.org/blog/?p=1079)


Site devoted to *haikyo* (廃墟), or abandoned Japanese buildings.


A list of unsuitable thesis topics in physics, aka "lost causes", compiled by R. F. Streater:

http://www.mth.kcl.ac.uk/~streater/lostcauses.html

A complementary list of "good causes":

http://www.mth.kcl.ac.uk/~streater/regainedcauses.html


Lost cause XVII is *To convert R. Penrose to the Copenhagen view*. The author comments "No chance; I've tried."


Mozart actually composed a variant of his Serenade No. 10 for a variable number of bassoons; it was described by a "Gran Partita function".


I feel like I can tentatively groan at this joke; the groan only became complete after a bit of wikipedia'ing.


Listened to *Tomorrow's Harvest* (the new *Boards of Canada* release) about 12 times today. It's great! *Reach for the Dead* is a fine single with an Instagram-meets-Carlos-Castaneda film clip.


Apparently, the "P. A. M." in "P. A. M. Dirac" stood for "Poincare Aloysius Mussolini".

An interview with Dirac. Professor Michael Kiessling, a faculty member in the Rutgers Department of Mathematics whose field of study is mathematical physics, kindly sent me a transcript of an actual i...


Oxford number theorist Minhyong Kim gives some insight into the abc conjecture and Mochizuki's purported proof (via John Baez):

Here are some comments by Minhyong Kim, an expert on number theory at the University of Oxford, on Mochizuki's attempt to prove the abc conjecture. He starts with a simple intuitive explanation of the idea behind the conjecture, and then says a bit about Mochizuki's approach and the broader context.

------------

1. Why is ABC so connected to other elements of number theory? What does this proof mean for number theory, if it proves true?

The ABC conjecture is concerned with the equation

A+B=C

where A,B,C are integers having no common factors. For example,

1+2=3

9+5=14

and so on.

The conjecture says roughly that if there are prime numbers that divide either A or B too many times, then their presence has to be 'balanced out' by largish primes that divide C only a few times. For example, consider

81+64=145

We see 3 divides 81 four times and 2 divides 64 six times. But then, 145 = 5 × 29, so you get the larger primes 5 and 29 dividing 145 just once.

For another example, consider

1024+1=1025.

Here, 2 divides 1024 ten times. But 1025 = 5 × 5 × 41, so you see 5 only divides it twice and the prime 41, a good deal larger than 2 or 5, divides it only once.

These are obviously very basic considerations. It's not an uncommon phenomenon for a basic question, when asked in the right way, to yield many consequences for mathematics. One way of thinking about the ABC conjecture is that it probes in a very fundamental way the tension between the two arithmetic operations, addition and multiplication. Divisibility and primes are multiplicative notions. The ABC conjecture sets a subtle and rather precise limit on how these notions can interact with addition. Since so much of number theory builds on the relation between addition and multiplication, it's not surprising that understanding such constraints might be useful in many ways!
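Kim's examples can be checked mechanically. The usual way to make 'balanced out' precise is the radical rad(ABC), the product of the distinct primes dividing A, B and C; the conjecture roughly says that C only rarely exceeds a fixed power of rad(ABC). A quick sketch (the function names are mine):

```python
from math import gcd, log

def radical(n):
    """Product of the distinct primes dividing n (trial division)."""
    n, rad, p = abs(n), 1, 2
    while p * p <= n:
        if n % p == 0:
            rad *= p
            while n % p == 0:
                n //= p
        p += 1
    return rad * n if n > 1 else rad

def quality(a, b):
    """log C / log rad(ABC) for the triple a + b = c.
    Values above 1 mark unusually 'good' triples."""
    c = a + b
    assert gcd(a, b) == 1
    return log(c) / log(radical(a * b * c))

print(radical(81 * 64 * 145))  # 2 * 3 * 5 * 29 = 870
print(quality(81, 64))   # < 1: the large primes 5 and 29 keep things balanced
print(quality(1024, 1))  # > 1: 1024 + 1 = 1025 is an unusually good triple
```

In this language, ABC says that for any exponent above 1, only finitely many coprime triples have quality exceeding it.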

2. In particular, what is the connection with Fermat’s last theorem? I’ve read that ABC makes it trivial to prove FLT – how so? And could this trivial proof be the one Fermat himself had in mind?

The ABC conjecture can be formulated with varying degrees of precision. The more precise versions will yield a quick proof of FLT quite different from that of Wiles, but only for sufficiently large exponents. That is, FLT says that

X^n+Y^n=Z^n

has only the obvious solutions for n at least 3. Precise ABC will give the same statement, but only for n larger than some number M that can be computed in principle. Depending on how precise the ABC statement is, you might get the statement

X^n+Y^n=Z^n has only the obvious solutions for n bigger than one million

or something like that. So you don't get the whole thing, but it's still remarkable.

The implication

ABC implies FLT for large exponents

certainly could have been known to Fermat. That is, this is elementary, and anyone in Fermat's time had the conceptual machinery necessary to come up with this implication. However, Mochizuki's current approach to prove ABC itself is ultra-modern in nature.
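The implication Kim calls elementary can be sketched in a few lines (a standard argument, stated here with a clean hypothetical exponent; it is not part of the interview). Suppose X^n + Y^n = Z^n with X, Y, Z coprime positive integers, so X, Y < Z:

```latex
% rad(N) denotes the product of the distinct primes dividing N, so
\mathrm{rad}\!\left(X^n Y^n Z^n\right) = \mathrm{rad}(XYZ) \le XYZ < Z^3 .
% A strong form of ABC, say C < \mathrm{rad}(ABC)^2 for all coprime triples, then gives
Z^n < \mathrm{rad}\!\left(X^n Y^n Z^n\right)^2 < Z^6 ,
% forcing n < 6: FLT would hold for every exponent n \ge 6.
```

The actual conjecture only gives C < K_ε · rad(ABC)^(1+ε) for each ε > 0, so the exponent bound M depends on the constant K_ε; that is why one gets FLT only for n larger than some M computable in principle.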

3. Also, does proving ABC provide the first independent (from Wiles's) proof of FLT, or have there been others?

This is true. Even for sufficiently large exponents, there were no other proofs in the interim. In fact, the situation was quite unusual, for a celebrated theorem not to have received an alternative proof almost two decades after its resolution. I suppose this is testimony to the depths of Wiles's original insight.

4. Is this proof in a form that mathematicians expected? My understanding is that Mochizuki is working quite far out from others – will that make validating the proof difficult? What happens next?

There are elements of Mochizuki's strategy that were to be expected. For example, the proof goes via a circle of conjectures due to Paul Vojta and Lucien Szpiro about elliptic curves, which, in fact, could be vaguely reminiscent of Wiles's approach. On the other hand, the completely radical new ideas far outnumber the familiar notions.

It's not quite true that Mochizuki is working 'far out' in a strict sense. He is a very mainstream mathematician with a thorough grounding in arithmetic geometry and many deep prior results to his credit. This is what makes good mathematicians take his claims very seriously, in spite of the unusual nature of the machinery he has developed. How long it will take for people to evaluate the work, it's hard to say, possibly even a year or so. Among other difficulties, his work probes the very core of mathematical language such as what we might really mean by a number or a geometric figure, and how they might be interpreted in a manner quite different from usual conventions. In fact, it relies on deep relations of a geometric nature between such varying interpretations. Such questions have occupied philosophers for millennia, but are usually quite distant from the consciousness of modern mathematicians. But then, these seemingly philosophical questions have to be recast in the robust language of precise mathematics. You have to add to that some of the most sophisticated portions of 21st century arithmetic geometry.

At the moment, I can fairly safely say that there is no one but the author who is familiar with all these things. Possibly his colleague Akio Tamagawa. The obvious next step is for Mochizuki to submit his papers, totaling around 500 densely-packed pages (based on perhaps another 1000 pages of previous technical papers), for publication in a reputable journal. Then the usual process of vetting and probing will begin in earnest.

5. Can you give any kind of lay summary of the proof? I appreciate that's a very difficult question - even an indication of how Mochizuki has connected different areas of maths, for example, would be useful.

I can't even give an expert summary of the proof because I don't understand it! Whatever glimmer of knowledge I have is based on long conversations with Mochizuki that took place about 5 years ago. My impression is that his programme has evolved considerably since then.

In addition to what I wrote in the other portions, perhaps it may be picturesque to remark that there has been an idea for some time that it would be nice to be able to 'deform' usual numbers in an arithmetically natural way. It has been vaguely understood that constructing the right kind of deformations of numbers would have consequences for many parts of number theory, including ABC as well as the Riemann hypothesis. There are other people working on such speculative notions, sometimes collectively referred to as the 'theory of the field with one element.' It might not be entirely inaccurate to say that Mochizuki manages to construct such a deformation, but only by departing drastically from the usual mathematical universe as we know it.

Described with a bit of poetic license, he is literally taking apart conventional objects in terrible ways and reconstructing them in new universes. This part, I think I have some feeling for. How he manages to come back to the usual universe in a way that yields concrete consequences for number theory, I really have no idea as yet.

6. The proof already seems to be causing a lot of buzz online, could you give some sense of how important/exciting this is for mathematicians?

If the proof turns out to be accepted, the significance is enormous. For one thing, there was another problem of deep concern to number-theorists called 'the effective Mordell problem.' Here you considered an algebraic equation like

f(x,y)=0

where f(x,y) is a polynomial of degree at least 4 in two unknowns. In the 1980's Gerd Faltings (Mochizuki's supervisor at Princeton) proved the so-called 'Mordell conjecture,' which says such an equation usually has only finitely many solutions in rational numbers x and y. (Here `usually' has a precise sense, which I'll not explain.) This was a big advance in number theory. However, there was a snag: We had no procedure, or algorithm, for actually finding this finite set of solutions! It could be done for special equations, or special classes of equations. For example, the theorem of Wiles implies that

x^n+y^n=1

for n at least 4 has only the solutions (1,0) and (0,1) if n is odd, to which (-1,0) and (0,-1) have to be added if n is even. However, we had no way of listing such solutions in general. That is, we would like a uniform procedure that allows us to be handed any polynomial f(x,y), do some routine computations with it requiring no ingenuity, at the end of which we have the complete list of rational solutions:

Input[f(x,y)] → [[Effective Mordell machine]] → Output[complete list of rational solutions]

To come up with such a procedure was the effective Mordell problem. The most important immediate consequence of Mochizuki's result, if correct and sufficiently precise, will be a solution to this problem. That is, it will give us an Effective Mordell Machine.
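To make the 'machine' picture concrete: a naive bounded-height search can find rational solutions, but it can never certify that its list is complete; the effective Mordell problem asks precisely for a computable height bound beyond which nothing new appears. A toy sketch for Wiles's example x^4 + y^4 = 1 (the function and the height cutoff are my own illustration):

```python
from fractions import Fraction
from itertools import product

def search(f, height):
    """Naive bounded-height search: test all rationals p/q with |p|, q <= height."""
    rats = {Fraction(p, q) for q in range(1, height + 1)
            for p in range(-height, height + 1)}
    return sorted({(x, y) for x, y in product(rats, repeat=2) if f(x, y) == 0})

pts = search(lambda x, y: x**4 + y**4 - 1, 6)
# Finds the four solutions (±1, 0) and (0, ±1) up to height 6,
# but nothing in this procedure certifies that the list is complete.
```

An Effective Mordell Machine would replace the arbitrary cutoff 6 with a bound computed from f itself, turning the search into a proof.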

Now, there are a number of other weighty mathematical consequences of ABC that I will not go into here. But the true significance, I think, will go considerably beyond specific problems, that is, beyond ABC itself. Mochizuki's method, if it's found acceptable to the mathematical community, is likely to yield a completely new way of thinking about numbers, figures, and other mathematical objects.

Alternative foundations for mathematics is an idea that's somewhat in the air in our times even for working mathematicians not particularly close to philosophy, partly as a result of new notions of space and time demanded by a certain kind of mathematical physics. Vladimir Voevodsky at the Institute for Advanced Study in Princeton comes to mind as well as Jacob Lurie at Harvard.

There were also several mysterious geometries in the 1980's outlined by the visionary mathematician Alexander Grothendieck, who had a tremendous influence on Mochizuki. However, such radical thinking that has immediate implications for everyday mathematics in the way Mochizuki is proposing will gather attention and have influence much more quickly than any amount of general theory.

To an extent, building theories really is the key activity of the mathematical community because a good theory is what gives us some kind of a general insight into the way mathematics works. Difficult problems, like FLT, the ABC conjecture, or the Riemann hypothesis, are analogous to the experiments performed by physicists in their labs or accelerators.

When a deep theory in pure mathematics can be used to resolve a problem that's well-known to be difficult, it gives us confidence that the theory is worth our while. In this way, Mochizuki's proof of ABC, if found acceptable, will convince people that this incredible (both in the positive and slightly negative literal sense) edifice he's been building up over the last twenty years or so is actually something solid, a well-grounded launching pad for further exploratory journeys into the mathematical universe.

------------



A more technical discussion (by Kim and others) at MathOverflow:

http://mathoverflow.net/questions/106560/what-is-the-underlying-vision-that-mochizuki-pursued-when-trying-to-prove-the-abc/


Official site of the disturbing but very talented Polish painter Zdzisław Beksiński. John Martin meets 70's sci-fi cover art.


Not exactly reverse mathematics, but an interesting exercise in model theory.

When we first study analysis rigorously we usually start with the ordered field axioms and then add some kind of completeness axioms to assert that there are no 'holes' in the real line. After that we can go off and prove all the usual theorems of analysis.

But this works in reverse too. If we start with the ordered field axioms it turns out that many well known theorems of analysis imply the completeness of the real line. This includes theorems that don't really look like assertions of completeness.

Propp gives a list of lots of theorems with this property. But just for fun he's also listed a bunch that *don't*, and makes it a puzzle to work out which is which.

I'm hopeless at analysis and don't expect to do well at sorting out which is which. I'm sure you can do much better.

But Propp's goal isn't just to list a bunch of puzzles. He has philosophical and pedagogical points to make. The philosophical point is that because so many theorems imply completeness, anything that looks vaguely like analysis *is* analysis. So, in some sense, you can't make small deformations to analysis.

Needless to say, this all assumes classical logic. Many of these theorems won't be equivalent when using intuitionistic logic.
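A concrete illustration of the 'hole' (my own example, not Propp's): Newton's iteration for the square root of 2 stays inside the rationals and is eventually monotone decreasing and bounded below, yet it has no rational limit. So the Monotone Convergence Theorem already fails over the ordered field Q, which is exactly why it can serve as a completeness axiom.

```python
from fractions import Fraction

x = Fraction(2)            # start above sqrt(2)
seq = [x]
for _ in range(6):
    x = (x + 2 / x) / 2    # Newton step for x^2 - 2, in exact rational arithmetic
    seq.append(x)

# The sequence is strictly decreasing and bounded below by 1,
# but x**2 - 2 never reaches 0, since sqrt(2) is irrational.
print(all(a > b for a, b in zip(seq, seq[1:])))  # True
print(seq[-1] ** 2 - 2 > 0)                      # True: still strictly above sqrt(2)
```

Over the reals the same sequence converges; over Q it is a bounded monotone sequence with nowhere to go.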


Abstract: Many of the theorems of real analysis, against the background of the ordered field axioms, are equivalent to Dedekind completeness, and hence can serve as completeness axioms for the reals. ...


Basic Information

Gender

Decline to State

Links

Contributor to

- Labyrinths (current)
- The Ruined Star (current)