### John Baez


How many bits of information could you fit in the whole universe?

Let's figure it out!  For starters, let's say we mean the observable universe.  The universe may be infinite in size.  But when we look back all the way to when the hot gas of the early universe first cooled down and became transparent, everything we see fits in a finite-sized ball centered at us.  That ball has by now expanded to be much larger.  This larger present-day ball is called the observable universe.

Maybe it should be called the 'once-observed universe', since we can't see what distant galaxies are doing now.  But regardless of what it's called, this ball is about 9 × 10^26 meters in diameter.  How much information could we fit in it?

When you keep trying to stuff more information into some region, eventually you get a black hole.  The information becomes inaccessible, so then it's unknown information, also known as entropy.  The amount of entropy is proportional to the surface area of the black hole.  And this is just 4π times the black hole's radius squared.

(At least this is true if the black hole isn't rotating or charged... and it's sitting in otherwise flat space.   Luckily, while the universe is expanding, space at any moment seems close to flat.)

So let's work out the surface area of the observable universe, by taking 4π times its radius squared.  We get 9.5 × 10^54 square meters.

How much information fits onto this area?  It turns out that for a black hole, each nat of information takes an area of 4 times the Planck length squared.  A nat is a bit divided by the natural logarithm of 2.  Computer scientists like base 2, but black holes seem to like base e.

4 times the Planck length squared is 10^-69 square meters.   So, calculating a bit, we see the number of nats of information we can pack into the observable universe is roughly 10^124.   I hope you check my work!

But I know you prefer bits, so let's divide by the natural logarithm of 2.  We get: the most information we can fit into the observable universe is 1.4 × 10^124 bits.
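Since the post invites checking the work, here is the whole chain of arithmetic in a few lines of Python. It's a rough sketch: the rounded figures from the post (the 9.5 × 10^54 m^2 area and the standard value of the Planck length) are hard-coded, so only the leading digits mean anything.

```python
import math

area = 9.5e54                      # surface area used in the post, m^2
planck_length = 1.616e-35          # meters
nat_area = 4 * planck_length**2    # ~1e-69 m^2: horizon area per nat of entropy

nats = area / nat_area             # ~10^124 nats
bits = nats / math.log(2)          # 1 nat = 1/ln(2) bits, so dividing by ln 2 converts

print(f"{nats:.1e} nats, {bits:.1e} bits")   # → 9.1e+123 nats, 1.3e+124 bits
```

Reassuringly, this lands on the same ~1.4 × 10^124 bits quoted above, up to the rounding of the inputs.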

Of course, doing this would require turning the observable universe into a black hole!  But if we did so, the black hole would have lots of quantum states.  If the entropy of a system is N bits, its number of quantum states is 2^N.  So, this black hole would have 2^(10^124) quantum states.

That's the biggest number I know that has any good physical meaning.  It's big... but still tiny compared to plenty of numbers I can easily name, like Graham's number.   And next I'll tell you about some numbers that make Graham's number look tiny.

By the way, don't take this picture very seriously!  It's just cute.  But it comes from a good page on black hole entropy:

http://www.scholarpedia.org/article/Bekenstein-Hawking_entropy

#bigness #astronomy

This is quite cool. If you don't mind me using the comments here, I'd actually like your opinion about something related. Recently I was looking at the Wikipedia entry for the Bekenstein bound (http://en.wikipedia.org/wiki/Bekenstein_bound) and it had the following account regarding the human brain, which I thought was misleading:
"An average human brain has a mass of 1.5 kg and a volume of 1260 cm^3. The energy (E = mc^2) will be 1.34813×10^17 J, and if the brain is approximated by a sphere then the radius (V = 4πr^3/3) will be 6.70030×10^−2 m.
The Bekenstein bound (I ≤ 2πrE/(ħc ln 2)) will be 2.58991×10^42 bits and represents the maximum information needed to perfectly recreate the average human brain down to the quantum level. This implies that the number of different states (Ω = 2^I) of the human brain is at most 10^(7.796405960701×10^41)."
While this is cute and represents an upper bound, it immediately struck me as misleading: the same bound would hold for any sphere of that size, so the emphasis on a brain seemed designed mostly to get attention, and I put it on hold. Do you think it deserves to be put back, and if so, how should it be reworded or introduced? Certainly it is not clear how many states an actual brain could take, and the Bekenstein bound might be off by quite a bit.
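Whatever one thinks of its relevance to brains, the quoted arithmetic itself is easy to verify. A quick sketch, plugging the Wikipedia numbers (E = mc^2 for 1.5 kg, radius ≈ 6.7 cm) into the Bekenstein bound I ≤ 2πrE/(ħc ln 2):

```python
import math

hbar = 1.0546e-34   # J s
c = 2.998e8         # m/s

def bekenstein_bound_bits(radius_m, energy_J):
    """Bekenstein bound: maximum bits storable in a sphere
    of the given radius containing the given energy."""
    return 2 * math.pi * radius_m * energy_J / (hbar * c * math.log(2))

brain_bits = bekenstein_bound_bits(6.70030e-2, 1.5 * c**2)
print(f"{brain_bits:.2e}")   # ~2.6e42, matching the quoted 2.58991×10^42 bits
```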

Thank you, I was wondering about just this thing, since your previous black hole post.

But now I am wondering if there is any way to intelligently speak of or quantify meta-information. If I have a collection of atoms, and they happen to form the works of Shakespeare, can I legitimately say there is additional information there, in addition to what we calculated via entropy?﻿

Thank you for letting a thirteen-year-old schoolboy (me!) know this.

- I'd be quite happy to have that stuff about the human brain gone from the Wikipedia page on the Bekenstein bound.  As you note, this bound has nothing to do with the human brain per se, and it gives an absurd overestimate of the number of brain states.  On a lesser note, it's also bad science to use numbers whose precision exceeds their accuracy, like saying the human brain would be 6.70030 centimeters in radius if it were a sphere.  All this sort of thing would be fine in a goofy blog post, but not an encyclopedia article.

Thank you! I wanted to ask an expert, and your excellent post gave me a fitting excuse to ask!

- There's a lot one might say about this.  Shannon's theory - deeply connected to entropy - reached a practical way of quantifying information without settling the harder question of which information is 'useful' or 'interesting'.  However, we can certainly measure, separately, the Shannon information in a string of words on a page and the information it takes to describe the molecules in the paper itself.  The latter vastly outweighs the former!  But if the latter doesn't matter to us, we can focus on the former.

Thanks

However, I wonder if we should consider this a lower bound on the universe's information? The observable universe has something like 10^184 Planck volumes, and there are some 10^80 atoms in the universe. I am not great with combinations, but offhand, that looks like a lot more than 10^124 bits.

Would that be (10^184)! / (10^80)! different combinations?

wrote: "However, I wonder if we should consider this a lower bound to the universe's information?"

No, it's supposed to be an upper bound.  I'm not sure what calculation you're trying to do, but I'm essentially saying there are at most 2^(10^124) ways a ball of stuff the size of the observable universe can be, even if you put in a lot more matter than there is now.  You can try to estimate the number of ways you can arrange the atoms we have here now, but remember 1) hydrogen atoms are all the same, nothing at all changes when you switch two of them, and 2) you can't squish an atom into a Planck volume without using a huge amount of energy.  It would be fun to use the uncertainty principle to estimate how much energy it would take to do this for all the atoms in the observable universe, and compare that to the mass of a black hole the size of the observable universe.  If the former quantity is bigger than the latter times c^2, it means the black hole wins.

I was essentially trying to calculate the number of ways to arrange 10^80 indistinguishable atoms, CENTERED in 10^184 Planck volumes.

Do we need to squish them, or is centering them on a Planck volume enough?
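For what it's worth, that naive positional count can be estimated directly (setting aside whether "centering an atom on a Planck volume" is physically meaningful): the number of ways to place 10^80 indistinguishable atoms into 10^184 distinct cells is the binomial coefficient C(10^184, 10^80), and Stirling's approximation makes its logarithm easy to compute:

```python
import math

cells = 1e184    # Planck volumes in the observable universe (rough)
atoms = 1e80     # atoms in the observable universe (rough)

# For atoms << cells, Stirling gives
# log2 C(n, k) ≈ k*log2(n/k) + k*log2(e)
log2_states = atoms * (math.log2(cells / atoms) + math.log2(math.e))

print(f"{log2_states:.1e}")   # ~3.5e82 bits
```

So even this naive count comes to only ~10^82 bits, vastly below the 10^124 holographic bound discussed in the post.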

Everything you try will raise further issues.  If you deal with all these further issues and figure out how to pack more information into a ball than I did, you may become famous, because there are pretty good arguments for the upper bound I gave:

http://arxiv.org/abs/hep-th/9908070

(Sorry, not very easy to read but the best thing I could find.  It's full of references that deal with this question.)

For starters, you seem to be giving some special significance to the Planck volume, without suggesting how you expect quantum gravity to enter this calculation.  Planck units typically come up in quantum gravity calculations because they're built from Planck's constant, the speed of light and the gravitational constant.  But there's nothing in quantum gravity that says we can get different quantum states of a bunch of hydrogen atoms by centering them within different Planck-volume-sized regions.﻿

Interesting. I will give it a look soon.

And no, I was specifically leaving out all quantum states and even easy vector information, like velocity. ﻿

Suppose we ignore quantum gravity and special relativity and just think about quantum mechanics.  Suppose we ask what's the dimension of the Hilbert space of a hydrogen atom in a box.  This is the precise version of what I was sloppily calling 'counting quantum states'.   The answer is: infinity.   So, the entropy of a single hydrogen atom in a box can be arbitrarily large in this simple model.

But if we ask about the dimension of the space of states where the energy is below some fixed value E, we get a finite answer, which gets bigger when the box gets bigger or E gets bigger.  The formula for this sort of thing is known.
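For the curious, the known formula is the free-particle state count N(E) ≈ (V/6π²)(2mE/ħ²)^(3/2) for a particle of mass m in a box of volume V. A sketch, with a hydrogen atom at a typical room-temperature thermal energy in a one-cubic-meter box (illustrative numbers only, not the universe-sized calculation proposed below):

```python
import math

hbar = 1.0546e-34   # J s
m_H = 1.67e-27      # kg, mass of a hydrogen atom

def states_below(E, V):
    """Approximate number of states with energy < E for a free particle
    of mass m_H in a box of volume V (valid when the count is large)."""
    return V / (6 * math.pi**2) * (2 * m_H * E / hbar**2) ** 1.5

# hydrogen atom, thermal energy ~6e-21 J, 1 m^3 box
print(f"{states_below(6e-21, 1.0):.0e}")   # ~1e30 states
```

Note how the count grows with both the box volume and the energy cutoff, which is exactly the behavior described above.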

So, as a first step toward making your count more precise, we could take the box to be the size of the observable universe, take not just one hydrogen atom but 10^80 of them, take E to be the energy corresponding to the mass of a black hole the size of the observable universe, and estimate the dimension of this space of states.

If it's more than 2^(10^124), that would be sort of surprising. ﻿

Well - I wrote this without reading your comment, and it seems we wrote about the same issue... but I won't delete it...

"When you keep trying to stuff more information into some region, eventually you get a black hole." - this is based on the following assumptions:
(1) to represent information you have to use matter (because you have to build the black hole out of matter; the Einstein equations are about matter, not about information);
(2) every kind of matter at finite energy (in the quantum sense) has a finite number of accessible states.

While we may obviously agree with (1), what about (2)? If you discovered a quantum system with an infinite number of distinct quantum states available at finite energy, you could encode infinite information in it without raising the energy. Then you could pack any amount of information you like and never get a black hole.
Is there any kind of matter with an infinite number of distinct internal states at finite energy? I suppose we do not know.

So my question here is:

"Is there any quantum law or principle saying that no quantum system can have an infinite number of states at finite energy?"

Please note that I am not saying anything about spacetime continuity here...

- There are many levels at which we can tackle your question: "Is there any quantum law or principle saying that no quantum system can have an infinite number of states at finite energy?"  I said a little about this in ordinary quantum mechanics, but it's also fun in quantum field theory.

As the arXiv article I mentioned points out, in quantum field theory in a finite-sized box, we could get infinitely many states of energy < E if (and probably only if) there were infinitely many different kinds of massless particles: photons, schmotons, etc...   But in reality we know only one kind of massless particle: the photon.  So this loophole is not too realistic.

What comes to my mind is rather a kind of particle with a continuous symmetry, so that at a single energy level you have an infinite number of different quantum states. It would be a kind of extreme degeneracy of the "internal" states.  As far as I know, no law of quantum mechanics or quantum field theory excludes that...
For simplicity, take the phase of the quantum wavefunction: it is not quantized nor directly measurable, but you can measure phase differences. AFAIK there is no "limitation theorem" here (an upper bound on the number of quantum states for a system at finite energy). And if there were one, it would be very interesting...

this whole business of observation is nonsense.﻿

That's actually an interesting point. A large number of particle species (i.e. quantum excitations above the vacuum) would indeed violate the holographic entropy bound. But the Hawking evaporation time is inversely proportional to this number. It turns out that when the number of species starts to be big enough to threaten the entropy bound for black holes of a given size, they evaporate faster than they can form, so the Bekenstein argument cannot be applied. There seems to be an interesting and mysterious self-consistency hidden in there. This is reviewed in this nice paper, section II.C.4:
http://arxiv.org/abs/hep-th/0203101
Also, this paper reviews the correct version of the entropy bound, which uses light-sheets and is a bit different from the spatial entropy bound discussed in these posts. Indeed, the latter fails for certain space-times.

That sounds like a tiny number compared to the amount of information my talkative taxi driver tried to convey to me yesterday. ﻿

Here: http://en.wikipedia.org/wiki/Bekenstein_bound#cite_note-Bekenstein1981-1-1 there is a link to Bekenstein's paper in which the upper bound for entropy vs. energy is introduced. There are two interesting points here:
(1) Degeneracy. Bekenstein assumes that for any energy level (degree of freedom) the degeneracy g_i is finite. As this agrees with everything I know about real physical systems, it is an interesting question whether this is a kind of accident or a general principle.
(2) Please take a look at Appendix A of Bekenstein's paper, where energy spectra for various systems are considered. Bekenstein takes bosonic and fermionic fields as examples. While for bosonic fields there is no problem with vanishing boundary conditions, for fermionic ones (actually neutrinos) a non-trivial boundary configuration is considered. Maybe this is just "fermionic sophistication", and nothing important. But maybe it is a sign of something important. If you want to generalise Bekenstein's approach, you have to ensure that physical systems obey the boundary conditions he assumes, or that this assumption is general enough...

Sorry for such amateur remarks here, but it is a very interesting theme indeed...

and how big of a ball would we get if we compress the observable universe into a black hole? ﻿

Two quick questions. First, is the particle in a box the correct toy model to use? It gives us serious constraints because of the boundary conditions (which don't exist in the universe) and those BC restrict <x> severely.

Second, it is marvelous that in your calculation, the upper limit of entropy depends on the mass of the universe, but not its size. This is particularly interesting given that delta S = Q/T. As the universe expands, T goes down, but I expect Q must go down faster, or the universe would continue to grow in entropy forever. The thing is, if we count actual position states, there is an argument you could make for that.

wrote: "First, is the particle in a box the correct toy model to use?"

I picked it to address the idea you raised in the simplest possible way.  You were imagining storing information in the position of a bunch of hydrogen atoms.   My idea was: so, lets use just quantum mechanics to put an upper limit on the information that can be stored in the quantum state of N hydrogen atoms in a ball of radius R having energy less than E.  This bound is pretty easy to calculate if we ignore the interaction between those atoms.  And then we can figure out what we get when E is sufficiently large that E/c^2 is the mass of a black hole of radius R.

"It gives us serious constraints because of the boundary conditions (which don't exist in the universe) and those BC restrict <x> severely."

If we take R to be the radius of the observable universe,  these boundary conditions say these hydrogen atoms lie in the observable universe.  That's just what we want, right?

"Second, it is marvelous that in your calculation, the upper limit of entropy depends on the mass of the universe, but not its size."

I'm not sure what you mean.  My original calculation uses the radius of the observable universe but not its mass.  The new simplified calculation I'm talking about mentions the mass of a black hole of radius R, but that's a function of R.  (It's proportional to R.)﻿

<x> would put the position at the "center" of the universe, with no possibility of being located elsewhere. If we had a single proton, then there are lots of places we could put it. But in a box, <x> is always the same. Am I reading too much into <x>?

Second, my bad, I read your calculation wrong. I will look over it more closely. ﻿

wrote: "and how big of a ball would we get if we compress the observable universe into a black hole?"

That's a fun question!  Luckily I have all the relevant numbers up my sleeve.  First let's ignore dark matter and do a quick back-of-the-envelope calculation.  The universe has about 100 billion galaxies each with about 100 billion stars.  So, the mass of the visible matter is roughly 100 billion times 100 billion times the mass of the Sun, or 10^22 solar masses.  Since the radius of a (nonrotating, uncharged) black hole is proportional to its mass, squashing all this visible matter into a black hole would give us one that's 10^22 times as big as a solar-mass black hole.

I forget how big that would be, but I can look it up... a solar-mass black hole would have a radius of 2.95 kilometers.  So, the rough answer is about 3 × 10^22 kilometers.

That's about 3 billion light years!  By comparison, the current-day radius of the observable universe is 46 billion light years.  That's surprisingly close, all things considered - but the surprise is really that the mass of a black hole is proportional to its radius, not its volume.  They're less 'dense' when they're bigger.
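The back-of-the-envelope above is easy to redo explicitly. A sketch with rounded constants, so only the leading digits mean anything:

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg
ly = 9.461e15      # meters per light-year

def schwarzschild_radius(mass_kg):
    """Event-horizon radius of a nonrotating, uncharged black hole."""
    return 2 * G * mass_kg / c**2

print(schwarzschild_radius(M_sun) / 1e3)          # ~2.95 km for the Sun
print(schwarzschild_radius(1e22 * M_sun) / ly)    # ~3e9 light-years
```

Since the radius is linear in the mass, the 10^22 factor carries straight through, which is the whole trick of the estimate.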

Actually you mean the Schwarzschild radius (http://en.wikipedia.org/wiki/Schwarzschild_radius), don't you? So this is the radius of the event horizon, but it may have nothing in common with the size of a "material object" within! In fact the Earth's Schwarzschild radius is about 1 cm; it lies inside the Earth, deep under our feet, right now. It is only a parameter. In particular, it does not tell you anything about density!

It does, however, tell you how dense matter has to get in order to collapse into a black hole in the first place. The surprise is that in the limit of a very large hole this goes to zero, so that if you have a sufficiently large amount of matter, making a black hole is in some sense not very hard, and some sort of exotic physical reactions in the matter will never keep it from happening.﻿

Is it really so surprising that the Schwarzschild radius for the mass of the observable universe is close to the actual size? By naive dimensional analysis, I'd think they'd be pretty close if (1) the universe is near the critical density for flat spacetime (presumably because of inflation), and (2) we aren't too far into the dark-energy-dominated acceleration epoch.

I'd think they'd be closer if you're including dark matter.﻿

I put "density" in quotes because I was referring to the mass of a black hole divided by volume of a sphere (in flat space!) whose radius is the Schwarzschild radius.  As Matt notes, this tells you how dense a ball of matter has to be to collapse into a black hole.  So for all the visible matter in the so-called observable universe, we'd only need to increase its density by a factor of about (46/3)^3 to get it to collapse.  This is roughly 4000.  Right now there's about 1 atom for each 5 cubic meters of space, so we just need a density of 800 atoms per cubic meter to cause a collapse.﻿
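Spelling out that estimate (the 46 and 3 are the radii in billions of light-years from the earlier comments, and 1 atom per 5 cubic meters is the usual rough figure; the post rounds the results up to 4000 and 800):

```python
# How much denser the visible matter must get to fit inside its own
# Schwarzschild radius: densities scale as the cube of the radius ratio.
compression = (46 / 3) ** 3          # ~3600, i.e. "roughly 4000"

current_density = 1 / 5              # atoms per cubic meter today, rough
needed_density = current_density * compression

print(round(compression), round(needed_density))   # ~3600 and ~720
```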

Do You mean that density limit for very large mass goes to zero? Yes indeed. In fact for very big mass it may be, that above the event horizon is "quite a lot of space" ;-) because of that ( I mean there is not an exotic subnuclear matter compressed etc. but ordinary barionic matter ). In old books they have wrote that for very massive black hole, observer falling down may not observer any anomaly at event horizon, nothing had to be changed, but way out is impossible... But I have saw, even on G+ discussion about firewall ;-) locater there so I am not so sure if it is sill valid point of view. ...

"This is roughly 4000.  Right now there's about 1 atom for each 5 cubic meters of space, so we just need a density of 800 atoms per cubic meter to cause a collapse." - well - this is really surprise!﻿

This is, in fact, the whole reason why the firewall claims are so disturbing!﻿

- " Is it really so surprising...?"

No, after you're done being surprised it's not so surprising.  I actually remember noticing something like this much earlier...

Ah yes, here it is... my article about the long-term fate of the universe:

"Why does the temperature approach a particular nonzero value, and what is this value? Well, in a universe whose expansion keeps accelerating, each pair of freely falling observers will eventually no longer be able to see each other, because they get redshifted out of sight. This effect is very much like the horizon of a black hole - it's called a "cosmological horizon". And, like the horizon of a black hole, a cosmological horizon emits thermal radiation at a specific temperature. This radiation is called Hawking radiation. Its temperature depends on the value of the cosmological constant. If we make a rough guess at the cosmological constant, the temperature we get is about 10^-30 Kelvin.

A black hole only shrinks by evaporation when it's in an environment cooler than the temperature of its Hawking radiation - otherwise, it grows by swallowing thermal radiation. The Hawking temperature of a solar-mass black hole is about 6 × 10^-8 Kelvin, and in general, it's inversely proportional to the black hole's mass. The universe should cool down below 10^-8 Kelvin very soon compared to the 10^66 years it takes for a solar-mass black hole to evaporate. However, before that time, such a black hole would grow by absorbing background radiation - which makes its temperature decrease and helps it grow more!

If a black hole ever grew to about 10^22 solar masses, its Hawking temperature would go below 10^-30 Kelvin, which would allow it to keep growing even when the universe has cooled to its minimum temperature. Of course, 10^22 solar masses is huge - about the mass of the currently observable universe!"

So, the "non-coincidence" is that the temperature of the universe will approach the Hawking temperature of a black hole whose mass is roughly equal to that of the observable universe.﻿
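The Hawking temperatures quoted above come from the standard formula T = ħc³/(8πGMk_B) for a nonrotating, uncharged black hole. A quick sketch with rounded constants:

```python
import math

hbar = 1.0546e-34   # J s
c = 2.998e8         # m/s
G = 6.674e-11       # m^3 kg^-1 s^-2
k_B = 1.381e-23     # Boltzmann constant, J/K
M_sun = 1.989e30    # kg

def hawking_temperature(mass_kg):
    """Hawking temperature of a Schwarzschild black hole, in kelvin."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

print(hawking_temperature(M_sun))          # ~6e-8 K for a solar-mass hole
print(hawking_temperature(1e22 * M_sun))   # ~6e-30 K for 10^22 solar masses
```

The inverse proportionality to mass is visible right in the formula, which is why scaling the mass up by 10^22 drops the temperature by the same factor.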

Is the Bekenstein bound a mathematical theorem, say in the sense of one which a number theorist completely uninterested in physics would recognize, or are there some not completely formalized hypotheses and parts of the proof that make this a physical principle?

The Bekenstein bound takes into account some physical assumptions, like boundary conditions or particular matter properties (fermionic, bosonic), so it is not a purely mathematical statement. But it is close to that, and it would be very interesting to formalize it in some way; I believe such a formalization is possible...

An analysis guy like me would ask questions like: what do you mean by area, where is entropy defined, what is the regularity class of the objects you use, could I take your universe as being a bag with ten marbles or did you forget to tell me something, are you supposing everything is continuous, differentiable, measurable, Hölder, what? Is there a mathematical statement where one can see the entire hypothesis and the conclusion? I tried several times to understand this very intriguing subject and each time I stopped. That is why I ask.

That is some great math. Love math, it just kind of comes naturally to me.

I am only an amateur in physics, so I may be wrong, but I suppose it may be formalized. First you have a physical quantum system (fermionic or bosonic) described by a Hilbert space for quantum field theory (you may start with statistical mechanics if you wish, since the two are related by an imaginary-time transformation). Then you have to define entropy and energy as functionals on this space, together with boundary conditions of some kind. The next move is to formalize Bekenstein's reasoning; the central object would be the "density of states" function, for which you would look for an upper bound as general as possible - in Bekenstein's reasoning it is majorized (is that a word in English?) by a beta-function-like integral formula. At every stage you have well-defined mathematical objects. The interesting point is which level of generality you can achieve. Bekenstein's reasoning looks rather particular, in the mathematical sense, to me. But who knows...

It is interesting to calculate the mass of the observable universe and compare it to the mass of a black hole of the same size. As Matt pointed out you get a similar answer. If you consider the mass in a region with radius twice the size of the observable universe there will be eight times as much so naively it should form a black hole. This argument is wrong because the universe is expanding and the rule about forming a black hole inside the Schwarzschild radius only applies to static matter.

So the more interesting comparison is to look at the total amount of entropy in the universe compared to this upper bound of about 10^124. Most of the entropy is in super-massive black holes, so it can be done roughly. It will fall short of the upper bound by a factor that could be between 10^15 and 10^20 (feel free to improve on this estimate). Assuming a homogeneous universe, the amount of entropy increases with volume but the upper bound increases with area, so on scales which are 10^20 bigger than the observable universe the total entropy exceeds the Bekenstein bound.

Something has to give and there are three possibilities worth mentioning that I can think of. (1) the universe is closed and finite and smaller than this limit. (2) The universe is not homogeneous with matter density eventually decreasing inversely proportional to radius or faster (3) The Bekenstein bound also has to be modified to accommodate the expansion of the universe. Which is it?﻿
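The scaling argument in the middle paragraph can be sketched directly. The 10^20 shortfall is the speculative figure from above, so treat it as a placeholder:

```python
bound_now = 1e124    # holographic bound for the observable universe, bits
shortfall = 1e20     # assumed factor by which today's entropy falls short

def entropy(scale):
    """Entropy of a homogeneous universe grows with volume, ~ scale^3."""
    return (bound_now / shortfall) * scale**3

def bound(scale):
    """The holographic bound grows with bounding area, ~ scale^2."""
    return bound_now * scale**2

# The entropy catches up with the bound exactly when scale == shortfall:
print(entropy(1e20) / bound(1e20))   # ratio ~ 1.0
```

So under homogeneity, the crossover scale is just the shortfall factor itself, which is what makes the question above bite.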

The fourth: space itself has an infinite (not finite) number of degrees of freedom (maybe it is continuous?), so quantum gravity does not obey the holographic principle, nor the Bekenstein bound, which is only valid for baryonic matter. Why not?

I'm doing this on paper. I took into consideration all the variables; you gave me all the information that I needed. The rest is basic geometry: you gave the radius, and I know the equation of a sphere. There were some things I didn't know, but after I put those numbers in, it seems you're pretty close. Then again, my numbers could be off; like I said, I'm doing this with pencil and paper. Thanks for sharing.

wrote: "An analysis guy like me would ask questions like: what do you mean by area, where is entropy defined, what is the regularity class of the objects you use, could I take your universe as being a bag with ten marbles or you forgot to tell me something, are you supposing everything is continuous, differentiable, measurable, Holder, what?"

Wait a century and there will be a book that answers these questions.  For now the most rigorous treatment I've seen is this:

http://arxiv.org/abs/hep-th/9908070

It includes a lot of physical discussion of why some hypotheses are reasonable, but then there seems to be a proof - using differential geometry - that something like the Bekenstein bound follows from these hypotheses.  I don't promise that every step is justified, but it looks like standard differential geometry to me, sort of like the Hawking-Penrose black hole theorems.﻿

This paradox is exactly why you need the covariant entropy bound using light sheets. The spatial entropy bound doesn't always hold, as your example shows. I recommend the paper I mentioned in a previous comment for detailed explanations.

Also, the question of whether the matter density in our universe is enough to form a black hole can be more precisely rephrased as whether we live in an open or closed universe. In an open universe, the expansion continues indefinitely. In a closed universe, the expansion stops, the universe starts contracting, and everything ends up in a big crunch. This is analogous to living inside a black hole in the sense that all trajectories end up hitting a spacelike singularity. Observations show that we live in an open universe, but not by much.

All I'm doing, gentlemen, is using the info that I was given, and assuming that the sphere was stationary, not changing in size! There are a lot of variables, but it was fun. Even if I'm off, it's still fun.

I wrote: "For now the most rigorous treatment I've seen is this:

http://arxiv.org/abs/hep-th/9908070 "

However, I should emphasize (especially for Marius) that this is a purely classical version of the Bekenstein bound - that is, no ideas from quantum mechanics are used.  So, for physicists this is just a step toward the ultimate goal.  The ultimate goal would be to derive something like the Bekenstein bound from a theory of quantum gravity.  But we don't really have a theory of quantum gravity that's good enough to do this... unless we use a lot of hand-waving.  So, the Bekenstein bound that physicists are most interested in is nothing like a 'theorem' at this time.﻿

In the light of that tip, it looks like I was wrong ;-) I hope at least in an interesting way ;-)

surely you can only equate closed/open with crunch/eternal if the cosmological constant is zero, and it appears not to be. I thought observation shows that the observable universe is spatially flat as best as we can measure. The curvature could be negative, positive or zero. I was not aware that it had been observed to be negative to show that the universe is open (ignoring possibility of more complex topology). Is that the case?﻿

Right, my remark was imprecise, thanks for the correction.﻿

and thank you for the reference, I will look at it.﻿

wrote: "But in a box, <x> is always the same."

No it's not.  If a particle's wavefunction is nonzero only in the left of the box, <x> will be different than if it's in the right.﻿

wrote: "Something has to give and there are three possibilities worth mentioning that I can think of. (1) the universe is closed and finite and smaller than this limit. (2) The universe is not homogeneous with matter density eventually decreasing inversely proportional to radius or faster (3) The Bekenstein bound also has to be modified to accommodate the expansion of the universe. Which is it?"

Good question!  I see no reason to believe that (1) or (2) is true for our universe.  Since they involve the unobserved portion of our universe we can't rule them out definitively, but it certainly seems theoretically possible that the standard model of cosmology is correct and the universe is homogeneous and spatially almost flat at large length scales, which would contradict (1) and (2).

(3) seems to be the answer. Indeed, in his Scholarpedia article Bekenstein writes:

"In an infinite universe the holographic bound fails when applied to a sufficiently large region. In a closed (finite) universe the specification of R or bounding area becomes ambiguous."

That is, if the universe were closed, we could think of a small bottle as containing the whole universe outside that bottle.  So, it seems the bound needs to be clarified to be true.  I hope you - or someone - reads the Flanagan-Marolf-Wald arXiv paper I linked to, which claims to prove a theorem, and sees how this theorem dodges the problem you raise.﻿

wrote: "The curvature could be negative, positive or zero. I was not aware that it had been observed to be negative to show that the universe is open (ignoring possibility of more complex topology). Is that the case?"

Just to answer this (perhaps rhetorical) question, I believe the best estimates of the current spatial curvature are consistent with it being zero.  If so, the universe could be infinite in extent, that is R^3, or finite in extent, that is, some flat torus S^1 × S^1 × S^1 covered by R^3.  Or, for that matter, an enormous cylinder like R × S^1 × S^1 or R^2 × S^1.  Wouldn't that come as a big surprise!

Any of these options and others could be right, of course. We are far from knowing yet. However, to add some balance, I have to say that I don't like option (3). The holographic principle has been argued for on the basis of black hole thermodynamics, and these arguments may not apply to an expanding universe with dark energy. Yet ultimately the bound should follow from a deeper theory of quantum gravity. This may mean that physics really can be described by a theory living on the boundary surface with entropy limited by its area in Planck units. In that case the simple version of the bound would hold no matter how big the surface was. I can easily accept a universe that is closed and expanding forever, or even one that is inhomogeneous on large scales. The cylinder might work too.

wrote: "This may mean that physics really can be described by a theory living on the boundary surface with entropy limited by its area in Planck units."

It may mean that, but 1) we don't know, and 2) observations so far make it plausible that our universe has a roughly constant entropy per unit volume.  So I would use what we seem to be observing to correct and improve our theories of quantum gravity, not insisting on principles we're not sure of.  But physics is a multi-player game and it's good to have people working all the angles.﻿

said, "the most information we can fit into the observable universe is 1.4 × 10^124 bits."

If you calculated the amount of information a few femtoseconds after the Big Bang would you get the same amount?

And if not, where did all the information in the universe today come from if it was not present at the Big Bang?﻿

"If you calculated the amount of information a few femtoseconds after the Big Bang would you get the same amount?"

No, because the maximum amount of information you can fit in a ball is proportional to the surface area of that ball, and the universe has been expanding.

"And if not, where did all the information in the universe today come from if it was not present at the Big Bang?"

I didn't calculate the amount of information that is in the big ball we call the observable universe.  I calculated the maximum amount you could fit in this ball, if you packed it to the point where it formed a black hole.  There's not actually enough matter around to do this.

It's sort of like taking a big empty room and calculating how much information it could hold if it were stuffed full of DVDs.﻿
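For anyone who wants to redo the arithmetic from the post, here is a short sketch using the rounded numbers quoted above (the 9.5 × 10^54 m² area and the Planck length); because the inputs are rounded, the result matches the 1.4 × 10^124 figure only to within rounding:

```python
import math

# Rounded inputs quoted in the post (treat as order-of-magnitude values).
area = 9.5e54              # surface area of the observable universe, m^2
planck_length = 1.616e-35  # Planck length, m

# Holographic bound: one nat of information per 4 Planck areas.
nats = area / (4 * planck_length**2)
bits = nats / math.log(2)  # 1 nat = 1/ln(2) bits, so bits = nats / ln(2)

print(f"max information: {nats:.1e} nats = {bits:.1e} bits")
```

This is the biggest-possible-number calculation, not the actual information content of the universe, which (as discussed below) is far smaller.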

Ah, I see.  Is there a way to calculate how much information there is in the universe?﻿

I think you are saying it is greater than zero and less than 1.4 x 10^124 bits, but can we narrow it further than that?﻿

Yes, it's between those numbers, but I wasn't trying to narrow it down: I was trying to come up with the biggest possible number that has any physical meaning.  Namely: if the observable universe were a black hole, that black hole would have 2^(1.4 × 10^124) different quantum states.

"Is there a way to calculate how much information there is in the universe?"

First, I need to pedantically scold you: I'm always talking about the observable universe.  This seems to be very small compared to the universe.  Since we can't observe the unobservable portion - yet - we can't be sure.  But the universe seems to extend infinitely in all directions.

Second: yes, we can say something about the amount of information in the observable universe.  We can estimate the total amount of entropy in the stars, gas, dust and black holes in the observable universe.  This is basically the amount of information it would take to describe everything about that stuff.

(The dark matter is poorly understood, so I wouldn't want to try that.)

Further up this comment thread, Phil Gibbs estimated the entropy for the supermassive black holes in the observable universe.  He claimed that most of the entropy is due to these guys.  While that seems really plausible, I'd like to check.

In this post:

https://plus.google.com/u/0/117663015413546257905/posts/6jggh2yDYUQ

I calculated the amount of entropy of the single biggest supermassive black hole we know: 1.7 × 10^98 bits!﻿
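A number of this order can be reproduced from the horizon-area formula alone. The mass plugged in below, roughly 4 × 10^10 solar masses, is a hypothetical round figure for a record-holding supermassive black hole, not necessarily the value used in the linked post, so the output agrees with the 1.7 × 10^98 figure only to within an order of magnitude:

```python
import math

# Standard SI constants
G = 6.674e-11      # gravitational constant
c = 2.998e8        # speed of light, m/s
l_p = 1.616e-35    # Planck length, m
M_sun = 1.989e30   # solar mass, kg

# Assumed mass (hypothetical round figure for the biggest known SMBH)
M = 4e10 * M_sun

r_s = 2 * G * M / c**2       # Schwarzschild radius
area = 4 * math.pi * r_s**2  # horizon area
nats = area / (4 * l_p**2)   # one nat per 4 Planck areas
bits = nats / math.log(2)

print(f"r_s = {r_s:.2e} m, entropy ~ {bits:.1e} bits")
```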

" if the observable universe were a black hole, this black hole has 2 to the 1.4 × 10^124 different quantum states."

Thanks. Although his numbers are at a much smaller scale than yours, Seth Lloyd did a calculation of the limits of computation (rather than just information), imagining if we could turn a black hole into a computer.
http://arxiv.org/abs/quant-ph/9908043v3﻿

John Baez does Archimedes﻿

Heh.  I just learned, by the way, that Archimedes wrote his Sand Reckoner as a letter to Apollonius.  That makes it seem even more cool.  And so yes, this is the updated version.﻿

Hmm, Wikipedia says it was written to King Gelon, and that must be true in some sense, but a historian of mathematics said it was part of a (mostly lost) correspondence with Apollonius.  Now I have to investigate!  That'll keep me occupied until I fall asleep....﻿

My best guess for Davies' book is the "Cosmic Jackpot".

Lloyd goes on to estimate the total number of "ticks of clocks" in http://arxiv.org/abs/1206.6559

Somewhat more speculatively, entanglement (and, I would say with even less certainty, other uniquely quantum effects) should affect these bounds in some way by increasing the amount of energy that can be confined to a given volume of spacetime, to be a bit imprecise. Again I could be misremembering, but this seems to be a reason for associating dark energy with such effects, at least in the papers by Jae-Weon Lee, which I think may have been motivated by the observation that black-body radiation is not entangled (at least in spin and OAM): http://arxiv.org/pdf/quant-ph/0510137.pdf. That is, entangled photons taken singly would have less energy than the linear relationship of Wien's displacement law suggests.

Either way it is intriguing how the classical calculation differs from a full quantum one.

Lastly, it really bugs me that I can download a billion arXiv papers to my computer and its weight does not change. Furthermore, ink increases the weight of a printed page, scratching text into it with a knife removes material and decreases the weight, and indentations pressed into clay keep the weight constant. The last example seems to imply there is more to it than thermodynamics. What is the relation between information and mass/energy? Is this something that is known, and if so, where is the source of my confusion?

I have to object to your calculation, though. As you know, in the Schwarzschild solution one can define a radial coordinate measuring proper radial distance, or one can define a radial coordinate from the area of a sphere at a given distance from the horizon, taking the radial coordinate to be that of a Euclidean sphere with the same area. The two don't match. (I only mention Schwarzschild because this particular pair of coordinates should be well known to anyone who has had any exposure to general relativity, not because it's directly relevant to our example.)

So I think you have to estimate the area of the visible sphere directly, you can't claim that we have some other measure of distance to the edge of the observable universe and then take a sphere of that radius.

Also, if you filled the universe with that many bits, a black hole would be generated, or at least a cosmological horizon. So maybe we can think of the cosmological horizon of anti-de Sitter space as a model for this sort of calculation. After all, if the current cosmological estimates are right and the expansion of space is accelerating, in the distant future we may well see a cosmological horizon not unlike the AdS one.

You also write "the surprise is really that the mass of a black hole is proportional to its radius, not its volume.  They're less 'dense' when they're bigger.". Again that's the radius defined as the square root of the area of the horizon (up to a factor of 4 pi), it has nothing necessarily to do with the volume of the space inside.

But the fact that the area of the black hole grows as the square of its mass implies that the gravitational acceleration (and the tidal forces) near the horizon get weaker as the black hole mass gets larger.﻿
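As a sanity check on that last remark: for a non-rotating black hole the surface gravity is kappa = c^4 / (4GM), so it falls off as 1/M. A tiny script (ordinary SI constants, nothing taken from the thread) makes the trend visible:

```python
# Surface gravity of a Schwarzschild black hole: kappa = c^4 / (4 G M).
# Doubling the mass halves the gravitational acceleration at the horizon.
G = 6.674e-11    # gravitational constant
c = 2.998e8      # speed of light, m/s
M_sun = 1.989e30 # solar mass, kg

def surface_gravity(M):
    """Horizon surface gravity in m/s^2 for mass M in kg."""
    return c**4 / (4 * G * M)

for solar_masses in (10, 1e6, 1e10):
    kappa = surface_gravity(solar_masses * M_sun)
    print(f"{solar_masses:>8} M_sun: kappa = {kappa:.2e} m/s^2")
```

A stellar-mass hole tears you apart at the horizon; at a supermassive one you could cross without noticing.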

>First let's ignore dark matter

Did you just ignore 90% of what you were calculating?

>That's about 3 billion light years!  By comparison, the current-day radius of the observable universe is 46 billion light years.

And if you really only used 10%, then 100% would give more like 30 billion light years, which is even closer.﻿

wrote: "So I think you have to estimate the area of the visible sphere directly, you can't claim that we have some other measure of distance to the edge of the observable universe and then take a sphere of that radius."

Like many other people I'm defining the observable universe to be a sphere in the spacelike slice at constant cosmological time that contains us now.  People now believe this slice is close to flat -  that is, Euclidean space.  I'm using this to compute the sphere's area using the usual formula, 4 pi times radius squared.

Unlike many other people, I freely admit that all but one point in this 'observable universe' is not observable by us right now.﻿

wrote: "And if you really only used 10%, then 100% would give more like 30 billion light years, which is even closer."

Right!

By the way, I'd need to check some stuff before trying to compute the entropy of the observable universe.  I'm really hoping Kevin Kelly will pay me somehow to do it.  My guess is that most of the entropy is in supermassive black holes, even if we count dark matter, because black holes have such high entropy per mass: the entropy grows like the square of the mass, so this ratio grows linearly with the mass.  But to be sure, I should find the current estimates on how much of the universe's mass is in the form of supermassive black holes, and the entropy per atom of hydrogen in the universe... and other stuff.  If dark matter is WIMPs, someone has probably guessed the entropy of that too.  But if it's primordial black holes, as Daniel Estrada keeps insisting, that could mean a lot more entropy.
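On the entropy-per-mass point: since the Bekenstein-Hawking entropy goes as M², the entropy per unit mass grows in proportion to M, which is why concentrating matter into ever bigger holes wins. A quick sketch (standard constants only; the masses chosen are arbitrary illustrative values):

```python
import math

# Standard SI constants
G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34
M_sun = 1.989e30  # solar mass, kg

def bh_entropy_bits(M):
    """Bekenstein-Hawking entropy in bits: horizon area / (4 l_p^2 ln 2)."""
    l_p2 = hbar * G / c**3           # Planck length squared
    r_s = 2 * G * M / c**2           # Schwarzschild radius
    area = 4 * math.pi * r_s**2
    return area / (4 * l_p2 * math.log(2))

# Entropy per kilogram grows linearly with the mass:
for sm in (1, 1e6, 1e9):
    M = sm * M_sun
    print(f"{sm:>6} M_sun: {bh_entropy_bits(M) / M:.2e} bits/kg")
```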

How many bits? Surely, all of them...﻿

It looks like the number of states should be rapidly decreasing with time, due to the shrinking horizon imposed by accelerating expansion.﻿

If expansion keeps accelerating and this means there is a cosmological horizon, stuff keeps falling behind the horizon. However, more "space" continues to be created and this increases the amount of possible states within the cosmological horizon.﻿

What is so profound in reasoning that goes as follows?  You assume that only a finite number of states is possible within a finite volume, and you assume they are determined by boundary conditions on the boundary hypersurface.  Then you get a bound on the number of states inside, because there is only a finite number of different boundary conditions.  I understand that it may be hard to prove this in a formal way.  But is it surprising?  Not at all.  Everything is already contained in the assumptions.

I completely do not understand the relation between matter and information. While I agree that a finite volume may contain only a finite amount of matter, I do not understand why it may contain only a finite amount of information. If the relation between information and matter is straightforward - the only way to write information is to control some degrees of freedom - then this whole line of reasoning with the Bekenstein bound and the holographic theorem has a simple meaning: "every field state within a finite volume is determined by boundary conditions on the boundary hypersurface". That looks true, but it is hardly surprising or new if everything inside is a gauge field. So what is the point - why is it so surprising? It is true for nearly every differential equation we use in theoretical physics - not because it is deep, but because we can hardly work with equations that do not obey such a relation.
So is the holographic principle a new, sexy way to say something mathematicians have known for two or three centuries? Or does it have a serious and new physical meaning? Where?

BTW, the lack of a formal proof is interesting, because that is the hard part, and I suspect it is simply not true: even simple but general equations do not admit this kind of argument - take a look at the Navier-Stokes equations (turbulence) and try to apply this analysis: http://terrytao.wordpress.com/2007/03/18/why-global-regularity-for-navier-stokes-is-hard/  Is the situation in GR/ST any easier than that?

wrote: "It looks to be true, but is hardly surprising, or new, if everything inside is gauge field."

It's actually new.  In an ordinary quantum field theory in a cubical box of length L in flat spacetime, the entropy of the maximal-entropy state at temperature T is approximately some function of T times L^3.  This is even true for gauge fields, e.g. electromagnetism, where the answer is about T^3 L^3.   It's not true that for gauge fields the state of the interior can be reconstructed from the state on the boundary.  We only get a holographic principle - that is, an answer that grows as a lower power of L, like L^2 - if we take gravity into account.  And deriving this holographic principle has not really been accomplished, except in a hand-wavy way, because we don't have a good enough theory of quantum gravity.  In particular, we should ultimately take states of the quantized gravitational field into account when computing entropy.

For a clear and nontechnical discussion of these issues try this:

• Gerhard 't Hooft, Dimensional reduction in quantum gravity, http://arxiv.org/abs/gr-qc/9310026.﻿
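To make the T^3 L^3 versus L^2 contrast above concrete, here is a rough scaling estimate of my own (all O(1) factors dropped, so only the orders of magnitude mean anything): the box size at which a photon gas at CMB-like temperature would first saturate the holographic bound turns out to be fantastically larger than the observable universe.

```python
import math

# Standard SI constants
hbar, c, k_B = 1.055e-34, 2.998e8, 1.381e-23
G = 6.674e-11
l_p = math.sqrt(hbar * G / c**3)   # Planck length, m

T = 2.7                            # temperature, K (CMB-like)
lam = hbar * c / (k_B * T)         # thermal wavelength scale, m

def thermal_entropy(L):
    """Photon-gas entropy in a box of side L, in nats, up to O(1) factors."""
    return (L / lam)**3

def holographic_bound(L):
    """Holographic bound for the same box, in nats, up to O(1) factors."""
    return (L / l_p)**2

# Crossover scale where the L^3 thermal entropy would hit the L^2 bound:
L_star = lam**3 / l_p**2
print(f"crossover at L ~ {L_star:.1e} m")
```

Since the observable universe is only ~10^27 m across, thermal radiation comes nowhere near saturating the holographic bound; only gravitational collapse does.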

There is no reason why the entropy per unit proper volume cannot also be O(1). In that case, the volume enclosed scales linearly with the boundary area, which is a specific statement about the geometry.﻿

"if we take gravity into account" - but where do you get it, without a single equation for the gravitational field?
I have seen only two things:
(1) The assumption that there are no degrees of freedom below the Planck scale, used in the reasoning above. It has no physical meaning (are there any, even indirect, observations for that?). It may be true, but it may just as well be false. Maybe I simply don't know (I am an amateur), so I would like to ask you: is there any theory behind that beyond dimensional analysis? Is it a hard result (in the mathematical sense) of any present theory? Or is it a statement based on the assumption that only finite energies are possible, so we have to put a cutoff into our theory because it is useful in calculations?

(2) The assumption that, due to particular boundary conditions, S_total = S_black_holes + S_matter. S_black_hole is not even defined (there is only an analogy between entropy and the area of the event horizon), and it is not even clear why information cannot be lost. This definition is correct in a weak gravitational field (I suppose this is a statement about the geometry of spacetime that allows one to calculate volumes, areas, etc.).

If an egg drops from the table, information about its internal state is lost. It is a kind of theological reasoning to say that "oscillations of the atoms of the floor carry away information about the egg's internal structure, so in principle it is still present in the Universe".

It is completely unclear why a system containing some amount of matter has to end up as some number of black holes.  Where is the reasoning for that? There are many examples of stable configurations of matter without black holes. Thermodynamic equilibrium states for systems with gravitation do not have to consist of black holes only. So even if the equation S_total = S_matter + S_black_holes is valid, it may have nothing in common with the Bekenstein bound.  Where am I wrong?

Information is treated here as a strictly defined notion, even though in fact we do not know how to relate it to states of matter. It is a kind of "Democritus of Abdera"-style reasoning - a philosophical theorem, not a physical one... It is not even a mathematical statement...  Sorry, I do not buy it.

wrote: "Is it a hard result (in mathematical meaning) of any of present theories?"

I already told you:

"And deriving this holographic principle has not really been accomplished, except in a hand-wavy way, because we don't have a good enough theory of quantum gravity."

Don't look for something that's not there.  This is not calculation within an existing theory, this is fundamental physics: that is, the desperate struggle to find out new principles governing nature, upon which a theory might be based.﻿

I understand that, and I appreciate it! But what I read about this is different from what you are saying here. There are many people who write "we know that!" or "it is a strong indication that it is true". I finished theoretical physics many years ago (I have worked for years in an unrelated field, as a network administrator); I worked on renormalization of the Navier-Stokes equations for a particular kind of fluid model. I do not understand why people believe such weak reasoning so enthusiastically... It is so far from the standard of exactness of reasoning in the Standard Model. I do understand that it is "new physics" and we hope something big will arise from it. But it is far, far from any physics I know or like - not because it is hard, but because of the level of hype and hand-waving...

And when you say something like "this is fundamental physics: that is, the desperate struggle to find out new principles governing nature, upon which a theory might be based", it reminds me that when quantum mechanics arose, there was strong disagreement about its revolutionary way of description. But there were a couple of experiments which were correctly described by QM - correctly from its very beginning. There are none here, as far as I know...

In the wikipedia article on the Beckenstein bound, it says

"Bekenstein argued that it would be possible to violate the second law of thermodynamics by lowering it into a black hole. In 1995, Ted Jacobson demonstrated that the Einstein field equations (i.e., general relativity) can be derived by assuming that the Bekenstein bound and the laws of thermodynamics are true."

To me this seems like a pretty strong argument. The laws of thermodynamics were, after all, used by Planck to postulate quanta and by Einstein to argue that stimulated emission needed to take place at a precise rate. So your analogy with the prehistory of quantum mechanics may actually point in the opposite direction than you think.

http://en.wikipedia.org/wiki/Bekenstein_bound﻿

Wow - even you are misspelling 'Bekenstein', Miguel?﻿

wrote: "It is so far from exactness standard for reasoning for Standard Model."

Yes, of course!  Gerhard 't Hooft, whose paper I pointed you to, won the Nobel Prize for proving the perturbative renormalizability of gauge fields - a key step toward the Standard Model.  His paper here is much further from being rigorous, not because he's become stupid, but because quantum gravity is a problem that we're just beginning to understand.

"I do understand that it is "new physics" and we have hope something big will arise from that. But it is far, far from any physics I know or like."

The big problem is that physicists don't have any specific experimental data they're trying to explain.  It's a bit like trying to understand atoms in a world where Balmer hadn't discovered his formula for the spectrum of the hydrogen atom.  Bohr could have discovered his Bohr model but we couldn't tell it was on the right track - all we'd see is how weird and stupid-sounding it was!

I quit working on quantum gravity after 10 years or so, for essentially this reason.  It was a huge amount of fun to work on, since I helped develop spin networks and spin foams, and learned a lot of physics.  But I didn't want to spend my whole life doing things without a clear sign of whether I was succeeding.﻿

I plead speed typing...﻿

Speaking of tests for quantum gravity, whatever became of those experiments with neutrons that showed (or at least suggested) quantization of gravity? ﻿

I guess you're talking about this?

http://backreaction.blogspot.com/2007/06/bouncing-neutrons-in-gravitational.html

This doesn't really suggest quantization of the gravitational field, it just shows that quantum particles interact with a gravitational potential in the way you'd expect.  In other words, you compute the answer by treating the gravitational potential as a plain old function of height, not as a quantum field... and the answer matches experiment.﻿
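Incidentally, that bouncing-neutron result can be reproduced from the textbook "quantum bouncer" spectrum, E_n = (m g^2 hbar^2 / 2)^(1/3) a_n, where the a_n are the magnitudes of the zeros of the Airy function - exactly the computation in which gravity enters only as a plain classical potential. A back-of-the-envelope sketch (standard constants; the measured ground state in the experiment was around 1.4 peV):

```python
# Energy levels of a neutron bouncing on a mirror in Earth's gravity
# (the "quantum bouncer"): gravity enters only as the classical potential
# m*g*z, yet the levels come out quantized.
hbar = 1.055e-34   # J s
m = 1.675e-27      # neutron mass, kg
g = 9.81           # m/s^2
eV = 1.602e-19     # J per eV

airy_zeros = [2.338, 4.088, 5.521]  # first zeros of Ai(-x)

E_scale = (m * g**2 * hbar**2 / 2) ** (1 / 3)
for n, a in enumerate(airy_zeros, start=1):
    print(f"E_{n} = {E_scale * a / eV * 1e12:.2f} peV")
```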

"In 1995, Ted Jacobson demonstrated that the Einstein field equations (i.e., general relativity) can be derived by assuming that the Bekenstein bound and the laws of thermodynamics are true." - so the Bekenstein bound is a stronger assumption than the Einstein equations, not the opposite. But as far as I know, the goal is the opposite...

Ok, that is the quantum gravity experiment I was thinking of, but alas, not nearly as meaningful as I remember it. Thanks﻿

"so Bekenstein bound is stronger assumption than Einstein GR Equations". I'm not sure about that. Einstein GR + Thermodynamics => Black hole thermodynamics including Bekenstein Bound. But what Jacobson wrote is different from what's claimed in the Wikipedia page. From the abstract of http://arxiv.org/abs/gr-qc/9504004:

The Einstein equation is derived from the proportionality of entropy and horizon area together with the fundamental relation $\delta Q=TdS$ connecting heat, entropy, and temperature. The key idea is to demand that this relation hold for all the local Rindler causal horizons through each spacetime point, with $\delta Q$ and $T$ interpreted as the energy flux and Unruh temperature seen by an accelerated observer just inside the horizon. This requires that gravitational lensing by matter energy distorts the causal structure of spacetime in just such a way that the Einstein equation holds.

And the Unruh effect can be derived entirely within flat-space quantum field theory. The Unruh effect is an analogue of Einstein's equivalence principle ("gravity is indistinguishable from acceleration") in that it tells you how to transform quantum field theory between relatively non-inertial observers.
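For scale, the Unruh temperature is T = hbar·a / (2π c k_B); a one-liner shows why nobody has observed it directly (standard SI constants; the 1 g example is my own illustration):

```python
import math

# Unruh temperature: a uniformly accelerated observer in flat-space QFT
# sees a thermal bath at T = hbar * a / (2 * pi * c * k_B).
hbar, c, k_B = 1.055e-34, 2.998e8, 1.381e-23

def unruh_temperature(a):
    """Unruh temperature in kelvin for proper acceleration a in m/s^2."""
    return hbar * a / (2 * math.pi * c * k_B)

# Even at 1 g the effect is fantastically small:
print(f"{unruh_temperature(9.81):.1e} K")
```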

I have just read Jacobson's paper and it is very interesting indeed.  Thank you very much for the reference. As far as I understand, he clearly states that his reasoning describes thermal equilibrium - that is, an entropy calculation in which the curvature of spacetime is locally near zero at every point of the boundary he uses. Another way to say this: not many degrees of freedom are excited during the process.

That is exactly what I worry about when somebody writes S_total = S_matter + S_black_holes (see my earlier comment) and then says that eventually only S_black_holes remains, so a limit for S_total can be calculated using the S_black_hole formula.  This is how the Bekenstein bound is derived.  What worries me is the path of the thermodynamic process one should use to transform all the matter into black holes. If this process does not pass through thermodynamic equilibrium at every intermediate state, the S_black_hole formula is not correct for such a calculation - because during the "collapse" we are outside equilibrium. So I am afraid this reasoning is similar to phenomenological thermodynamics, and the Bekenstein bound only has meaning for processes at equilibrium. If so, every single word about the holographic theorem is not fundamental physics but a description of a very special kind of system (systems at equilibrium), neglecting any fluctuations... So it may be, by analogy, no more than a "mean field theory", not a fundamental one...

I'm not sure I understand your objections because it would be akin to disparaging the second law of thermodynamics on the grounds that quasistatic reasoning using Carnot cycles and such does not apply exactly to real physical systems out of equilibrium.﻿

Well, that's a good point. But I am not objecting to black hole thermodynamics, or to the Bekenstein bound itself, but to its relation to the holographic principle. There is no straight way to go from a bound on the number of excited states at equilibrium (for example via a Carnot cycle ;-) to a bound on the number of possible states of a fundamental system consisting of spacetime and some kind of matter. And what worries me is the bound on the number of degrees of freedom at the "Planckian scale" derived from that, which for me is just a fictional object... At equilibrium you usually have only a very limited number of excitations in the system. And from the microscopic point of view, the main point is that you have to be in contact with a thermal bath. Is it the rest of the universe? Or what? Or the unexcited degrees of freedom in a neighbourhood of each spacetime point? What are the possible channels of energy exchange here? Why does the finite speed of energy flux play no role here?
I strongly object to using the canonical ensemble here - and that is what they are doing. As GR (even in the quantum world) is about total energy, they have to use much more general reasoning... "Bekenstein bound ⇒ Einstein equations" is very interesting indeed, because it is not generic in fundamental physics; it is a special case. "Bekenstein bound ⇒ holographic theorem" is only a very limited conclusion for me.

Thanks, that really clarifies your objection. (No, I can't answer it right off the bat.)

Now, when you ask what the thermal bath the system is in contact with, I suspect that it's related to the infrared catastrophe. The low-frequency tail of all the quantum fields present is able to carry away information out to spatial infinity. The system tends to equilibrium with the cosmic background radiation. And local equilibrium arguments are involved in analyses of the Unruh effect.﻿

Well then, I'll bite: How much information is observed to exist in the physical universe?﻿

- wrote: "But what I read about that, is different from what you are saying here. There is many people who wrote that "we know that!" or "it is strong indication that it is true"."

Yes.  I'm more careful, or honest, or aware of the problems, than these other people.

One way people try to go from black hole thermodynamics to more general situations is to note that any small null hypersurface could, approximately, be part of an event horizon.  For example, we could be falling through the event horizon of a huge black hole right now!  There is no way to detect this locally.   So, if we can attribute entropy to the event horizon of a black hole, 1/4 nat per Planck area, perhaps we can do this to any null hypersurface.  The question then becomes: what does this entropy mean?

And this is why people start speculating about ideas like gravity being an 'entropic force'.  As you may know, Erik Verlinde won a 2.5 million euro prize for his work on this idea:

http://en.wikipedia.org/wiki/Erik_Verlinde
http://en.wikipedia.org/wiki/Entropic_gravity

but other people think it's nonsense.  As I said, everyone is desperately struggling to get ideas that work.﻿

- this calculation will take some real work, unlike the ones I've been doing so far in these posts.  And right now I need to prepare for class!  But maybe I'll give it a try.  Someone should already have done it, but if nobody has, it would be lots of fun to be the first!﻿

Do people think Verlinde's work is nonsense, or is it just that it doesn't tell us anything we didn't already know? To give an example, two of the authors I mentioned above had substantial work clearly and intentionally using that idea without expressly using that phrase. So his work added to theirs by making it clearer to others; it didn't come from great creativity or deep insight, but it is still nice, and I thought it was a good paper. I suppose the attention it got annoyed some, as it was significantly greater than the paper's contribution to the field. But that is independent of the science; that's human.

Regarding this post, it is interesting that your calculation is comparable to the earlier ones I mentioned, one of which used the critical density of the universe.

By "I thought it was a good paper", I mean I am glad it was written.

Some people think Verlinde's paper is nonsense, some think it was great... great enough to give him 2.5 million euros!... and some think it wasn't novel enough to deserve that.  I haven't been paying much attention to this stuff, but Ted Jacobson's work seems a lot more profound, along with its many offshoots, like this: http://arxiv.org/abs/1205.5529.﻿