Boltzmann Brains and Epistemology

Entropy

Entropy is one of the most interesting concepts in physics. It has sometimes been defined as the amount of disorder in a system, or as our lack of knowledge about the system. It is remarkable that entropy, a technical concept ubiquitous in physics equations, can be described in such non-scientific ways. Because it lends itself to such intuitive descriptions, it has inspired quite a few flawed arguments. One notable argument is the theory of Boltzmann Brains, which claims that all our memories and experiences are false byproducts of random chance that will cease in an instant. While this argument is scientifically sound, it makes an epistemological error: it does not correctly understand how we know things. The refutation of this argument points to the significance of conscious experience and to the importance and primacy of philosophy in understanding science.

Entropy can be calculated with a concept called multiplicity. The multiplicity is the number of indistinguishable possibilities that could cause the results we observe. For instance, consider a friend who tells you that she rolled two dice, a red one and a green one, and added the numbers together. If your friend tells you that she got a four, there are three possible ways in which this could happen: the red die has a three and the green has a one, the green die has a three and the red has a one, or both have twos. In this case, the multiplicity is 3. In contrast, if your friend tells you she got a two, there is only one possibility, namely both dice have a one. Thus, the multiplicity would be only 1.
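As a quick sanity check on that counting, here is a small Python sketch (an illustration, not code from the original article) that enumerates every red/green combination and tallies the multiplicity of each total:

```python
# Brute-force count of how many (red, green) die combinations give each total.
from collections import Counter
from itertools import product

multiplicity = Counter(red + green for red, green in product(range(1, 7), repeat=2))

print(multiplicity[4])  # 3: (3,1), (1,3), (2,2)
print(multiplicity[2])  # 1: (1,1)
print(multiplicity[7])  # 6: the total with the highest multiplicity
```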

From this concept, one can see why entropy is sometimes called our lack of information about a system. When the number given has low multiplicity, for instance the number 2, you know the exact value of each die. On the other hand, if there is high multiplicity, you know very little about each individual die. Entropy tells you how little information your friend’s number gives you about the value of the dice.
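Quantitatively, the standard link between multiplicity and entropy is Boltzmann's formula, with W the multiplicity and k_B Boltzmann's constant:

$$ S = k_B \ln W $$

For the dice, a total of 2 (W = 1) gives S = 0, while a total of 7 (W = 6) gives the largest entropy; the highest-multiplicity outcome is exactly the one that tells you the least about the individual dice.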

To see why entropy is called disorder, it helps to consider a toddler’s house with a fenced-off corner for toys. When the house is orderly, all the toys are in the pen. If you know all the toys are in the pen, you have a good idea of where each one is. There are not many possibilities for the positions of the toys, thus the system is in a low-entropy state. On the other hand, if the toddler is allowed to run wild, picking up toys and randomly carrying them until he feels the urge to drop them, the toys will eventually become spread throughout the whole room. In that case, the system would be in a high-entropy state. This is why entropy is associated with disorder.
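A rough way to put numbers on this analogy is to treat the room as a set of possible toy locations and count arrangements; the spot counts below are arbitrary assumptions made only for illustration:

```python
# Toy model (assumed numbers): count arrangements of distinguishable toys
# when they are confined to the pen versus allowed anywhere in the room.
import math

n_toys = 5
pen_spots = 4     # assumed number of toy-sized spots inside the pen
room_spots = 40   # assumed number of spots in the whole room

w_tidy = pen_spots ** n_toys    # multiplicity of the "all toys in the pen" state
w_messy = room_spots ** n_toys  # multiplicity of the "toys anywhere" state

print(w_tidy, round(math.log(w_tidy), 2))    # 1024 6.93
print(w_messy, round(math.log(w_messy), 2))  # 102400000 18.44
```

Opening up the whole room multiplies the number of compatible arrangements enormously, which is exactly why the “messy” state counts as high entropy.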

One of the most fundamental results in all of physics is the second law of thermodynamics, which says that the entropy of a closed system can either increase or stay the same, but not decrease. This …

Originally appeared on Daily Philosophy.
