First: entropy vs. information

I asked Susskind what the difference was. The qualitative descriptions of entropy and information seem different: entropy is how "messed up" something is; information is how much work you put into setting it up.

But when you get a quantitative definition, it suddenly appears they're the same. Entropy is the log of the number of microstates that correspond to a given macrostate. (You need a log because suppose you have a system with two independent subsystems. The first subsystem has N possible microstates. The second has M. The whole system then has N*M microstates. Its entropy is log(M*N) = log(M) + log(N). Defined as logarithms, entropies add.)
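That additivity is easy to check numerically. A minimal sketch (the subsystem sizes N and M are arbitrary values I picked for illustration):

```python
import math

# Two independent subsystems with N and M possible microstates each.
N, M = 6, 10

# The combined system has N*M microstates, so its entropy is log(N*M).
combined_entropy = math.log2(N * M)

# Defined as a logarithm, entropy adds across independent subsystems.
sum_of_entropies = math.log2(N) + math.log2(M)

print(combined_entropy, sum_of_entropies)  # the two agree
```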

Information is the number of bits required to specify the exact (micro)state you've prepared. If you multiply the number of possible microstates by 2, you need one extra bit to describe which of those microstates you picked. So information is also a logarithm of the number of microstates.

Imagine I have a chess board and some pieces. The squares of the board are numbered. (This is done so that two boards that are 180-degree rotations of each other will count as separate microstates. You wouldn't actually have to number every square. You could mark one of the black corners as "A1" and the rest would be determined from there.)

Now suppose I have one pawn to place on the board. I can place it on any of the 64 squares, so there are 64 microstates (specific configurations) that correspond to the macrostate "one pawn on a chessboard". That means the entropy is log(64) = 6. (I'm using base 2. It doesn't really matter which base you choose, because they're all related to each other by a constant factor.)

The information is the number of bits needed to describe the state. It is also 6. You must specify a number between 0 and 63 to indicate which square the pawn is on. That takes 6 bits.
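The two counts can be checked directly:

```python
import math

squares = 64                       # microstates for one pawn on a numbered board
entropy_bits = math.log2(squares)  # entropy, in base-2 units (bits)

# Information: bits needed to name one square, numbered 0 through 63.
info_bits = (squares - 1).bit_length()

print(entropy_bits, info_bits)  # both come out to 6
```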

If I had the entire slew of 32 chess pieces, nothing much would change. There would be many more possible configurations. Say N of them. Then the entropy would be log(N). But to find the information, I could make a long numbered list of every possible configuration of the pieces. To specify the exact position on the board, I find that position on the list and write down its corresponding number. Since there are N numbers, I'd need log(N) bits to say which number I wanted. It's the same number as the entropy.

This is called the information because I could easily work the process the other way. If I wanted to send a message containing log(N) bits, I could put those bits in order to make a number, see which chess position corresponded to that number, set up the board that way, and send someone the board instead of the binary number. Once they got the board, they could work backwards (assuming they had the same list) to get the bits I originally intended to send.
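The scheme above can be sketched in code. This toy version shrinks the "list of configurations" down to the 64 single-pawn boards from earlier, listed in a fixed, agreed-upon order; the names and message are illustrative, not from the lecture:

```python
# The agreed-upon numbered list of board configurations (here: one pawn,
# 64 possible squares, so messages are 6 bits long).
configurations = [f"pawn on square {i}" for i in range(64)]

def encode(bits):
    """Turn a 6-bit message into a board to send instead of the bits."""
    index = int(bits, 2)
    return configurations[index]

def decode(board):
    """The receiver, holding the same list, works backwards to the bits."""
    index = configurations.index(board)
    return format(index, "06b")

message = "101101"
board = encode(message)      # "pawn on square 45"
recovered = decode(board)    # "101101" — the original message comes back
print(board, recovered)
```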

The information a system has is how long a message it could send. This turns out to be mathematically the same as the entropy. So why use the term "information" at all? We already had the term "entropy" in place. Why not just keep it?

Of course my question was much shorter, and just asked what the difference is. Susskind said that entropy is "hidden information". His example was a bathtub of hot water. If there are waves on the surface, that's information, because I could take a snapshot of them, analyze them, and see what sort of message they held (about where water is dripping in from a faucet, for example; many geologists are in this business). But the entropy of the bathtub is much higher than just the amount of information you could store in waves. The bathtub has 10^30 or so molecules of water. The log of the number of possible quantum states consistent with a tub of 10^30 water molecules at 300K would be the entropy and, in theory, the information. It's a huge number. But there's no point in saying the tub has that much information, because there's no way we could ever measure the state of all those water molecules. No one would ever try to send a message across enemy lines by setting up a bunch of water molecules in an exact quantum state to be decoded by a quantum-bathtub-reader in the general's field headquarters. (Here I'm deviating a bit from Susskind's exact analogy.) Even if you did, the bathtub would interact with its environment, and decoherence would set in almost immediately. Although there's technically information there, we can't use it. It's still entropy, but it doesn't earn the tag of information.

The heat death of the universe would be the extreme example, I suppose. In this theoretical state long in the future, everything has come into thermal equilibrium. The universe is a uniform temperature throughout, with no stars and galaxies and certainly no life. It's "as boring as possible", meaning there's almost no information because if you wanted to tell someone what the universe was like, you could basically give its size and temperature and maybe one or two other things. That would be all they needed to know. This is the state in which the universe's entropy is at a maximum, while its information is minimal.

Next we went into a calculation to demonstrate the importance of entropy for a black hole. The problem we're trying to solve comes from the "no hair" theorem of General Relativity (the phrase is John Wheeler's). If black holes were bathtubs, they would have no ripples on their surface. There is nothing about a black hole that can give you detailed information. Black holes have only mass, electric charge, and angular momentum. Any two black holes with the same mass, charge, and angular momentum are just as identical as any two electrons. That means the entropy and information of a black hole are exceedingly tiny. In fact there is just one microstate corresponding to the macrostate we observe, so the entropy is zero! (I think. Don't quote me on that. Besides, this is the internet, meaning I'm basically supposed to get stuff wrong.)

If you toss a bucket of hot water into the black hole, you've added something with a great deal of entropy. But the black hole still has almost no entropy. So the entropy seems to have disappeared. This violates the second law of thermodynamics, which says the total entropy of a closed system never decreases.

Susskind attributed the next calculation to Jacob Bekenstein. I won't pretend to understand how to justify all the steps; I'll just repeat the general idea. I don't remember quite what all his steps were anyway. Somehow he avoided doing an integral, and never mentioned Boltzmann's constant. So my calculation is a bit different, but it starts and ends in the same places.

We'll imagine a procedure for adding mass and information to a black hole one step at a time. This is done by shooting a photon into the black hole. If we shot a high-frequency photon, we'd be adding a lot of information: a high-frequency photon has high resolving power, so we'd be able to tell where it entered the black hole, which counts as information. We'd be adding a great deal of entropy. Instead, we decide to shoot a photon whose wavelength is comparable to the radius of the black hole. That way the photon cannot resolve the black hole at all. We add exactly one bit of information/entropy: whether or not the photon entered. The entropy is measured in units of Boltzmann's constant. (It turns out that if you shoot a photon with wavelength greater than the black hole, it'll probably just bounce off. Also, when I say "it turns out", that means "I don't understand this.")

We consider the photon to be an incremental energy addition, dE. The energy of a photon is

E = h*c/L

with h Planck's constant, c the speed of light, and L the wavelength. Therefore we can write

dE = h*c/L*dN

dE is the change in energy of the black hole. N is the number of photons we've shot in, so dN is just one. This term is there so we have a differential equaling a differential. (I made this up, too. I really wish I remembered what Susskind wrote exactly.)

We're setting the wavelength L equal to the radius of the black hole.

dE = h*c/R*dN
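To get a feel for the size of one of these steps, here's a rough number. The ~3 km radius is an assumption on my part (roughly a stellar-mass black hole), not a value from the lecture:

```python
h = 6.626e-34   # Planck's constant, J*s
c = 2.998e8     # speed of light, m/s

R = 3.0e3       # assumed black-hole radius, m (~stellar mass)
dE = h * c / R  # energy of one photon whose wavelength equals R

print(dE)       # ~6.6e-29 J: each one-bit photon barely changes the mass
```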

The relationship between temperature and entropy is

dE = T*dS

with dE the change in energy of a system, T its temperature, and dS the change in entropy. For a black hole, we already know dE, because it's the energy of the photon, which we just calculated. We know dS, too (one times Boltzmann's constant). Plugging in what we already know about dE and dS we get

h*c/(R*k) = T

R = h*c/(k*T)

Here, k is Boltzmann's constant. This gives the temperature of the black hole given its radius, or the radius given the temperature; they're inversely proportional. Incidentally, the temperature of the universe is about 3K. Plugging in the constants, that temperature corresponds to a black hole of radius 5mm. Since real black holes are much larger than this, they are also much colder. Black holes are colder than their surroundings, so heat flows into them and they are all getting bigger. They'll continue to do so until the universe is much, much colder. Then they'll begin to lose mass, radiating energy away as blackbody photons faster than they absorb energy (also mostly photons).
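Plugging in the constants reproduces that 5mm figure:

```python
h = 6.626e-34    # Planck's constant, J*s
c = 2.998e8      # speed of light, m/s
k = 1.381e-23    # Boltzmann's constant, J/K

T = 3.0                # rough temperature of the universe, K
R = h * c / (k * T)    # radius of a black hole at that temperature

print(R * 1000)  # ~5 mm, as claimed
```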

The radius and mass of a black hole are directly related via

R = 2Gm/c^2

This is because the radius, called the Schwarzschild radius, is the distance from a point source of mass m at which the escape velocity equals the speed of light. The escape velocity depends on the kinetic energy needed to escape, which is equal to the depth of the potential well you're in. So the Schwarzschild radius depends on the shape of the potential.

The potential is the integral of the force, which is GMm/R^2 ("M" the test object mass, "m" the black hole mass). So the potential falls off as GMm/R. The kinetic energy is (1/2)*M*v^2. Set v = c, set the potential and kinetic energies equal, and solve for R. You get R = 2Gm/c^2. This feels like it shouldn't work, since the kinetic energy of something moving at speed c is actually infinite (if it has mass), but somehow the answer pops out regardless. The real calculation is a bit more involved.
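As a sanity check on that formula (the solar mass here is just a convenient reference value, not from the lecture):

```python
G = 6.674e-11    # gravitational constant, m^3/(kg*s^2)
c = 2.998e8      # speed of light, m/s

m_sun = 1.989e30           # mass of the sun, kg (assumed reference mass)
R_s = 2 * G * m_sun / c**2 # Schwarzschild radius

print(R_s)  # ~2950 m: a solar-mass black hole has a radius of about 3 km
```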

The energy that's gone into a black hole can tell us its mass.

E = m*c^2

m = E/c^2

Plugging this into the equation for the radius

R = 2G*E/c^4

E = R*c^4/(2G)

dE = dR*c^4/(2G)

Finally we return to the definition of entropy and write everything in terms of R. We'll use the expression above for dE and the earlier expression for the temperature in terms of the radius.

dE = T*dS

dS = dE/T

dS = [dR*c^4/(2G)]*[(R*k)/(h*c)]

Integrating over dR from zero to the final radius

S = k*R^2*c^3/(4G*h)

This is the result we were shooting for. The entropy of a black hole is proportional to its area. A better calculation gives slightly different constants, but the idea is the same. The number G*h/c^3 must be an area to make the units work out. It comes to roughly 10^-70 m^2. Its square root is the Planck length, roughly 10^-35 m. This is a length built out of natural constants, and it characterizes this particular process.
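A quick check of those numbers. Note that the standard Planck length is defined with hbar = h/(2*pi), which shifts things by a factor of 2*pi relative to the plain-h version used here:

```python
G = 6.674e-11    # gravitational constant, m^3/(kg*s^2)
h = 6.626e-34    # Planck's constant, J*s (the textbook Planck length uses hbar)
c = 2.998e8      # speed of light, m/s

area = G * h / c**3   # the natural area appearing in the entropy formula
length = area ** 0.5  # its square root, a Planck-scale length

print(area, length)   # ~1.6e-69 m^2 and ~4e-35 m with h; with hbar, ~1.6e-35 m
```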

What's interesting (according to Susskind; to me it's not very interesting, because I don't understand it well enough) is that this stuff about the thermodynamics of black holes involves an apparent paradox. General relativity says black holes don't emit radiation. But the black hole has a temperature, and so it must emit blackbody radiation like anything else with a temperature. When you look at a process whose characteristic length scale is the Planck length, you have to consider the interactions between relativity and quantum mechanics.

By the end of all this, I thought the calculation was straightforward enough, but I didn't really believe it meant anything. This was a calculation for one idealized process of adding information a bit at a time. But shouldn't the relationship between entropy and mass depend on the procedure for adding the mass? I could dump a bucket of hot water into the black hole, or freeze the water and dump in an ice cube. I'm adding the same mass, but it seems like the entropy I'm adding is very different. Intuitively to me, the radius of the black hole, which depends only on the mass, should increase by the same amount in both cases, but the increase in entropy should be different.

Apparently, this is not so. I have no idea why not, but the explanation came from Hawking, who generalized the simple procedure discussed here. Hawking worked out exactly what the constants should be in the relation between the entropy and area of a black hole, and showed that it's a general relation. Maybe someday before I die I'll be smart enough to catch a glimmer of how he did it.

I used these wikipedia pages:

http://en.wikipedia.org/wiki/Schwarzschild_radius

http://en.wikipedia.org/wiki/Entropy

http://en.wikipedia.org/wiki/Hawking_radiation

http://en.wikipedia.org/wiki/Planck_length

http://en.wikipedia.org/wiki/Black_hole_thermodynamics
