Quantum Aesthetics

Lee Schroeder

October 31, 1995

 

Introduction

 

There is little doubt that nature, at a fundamental level, is probabilistic in Kolmogorov's sense. Physicists have documented this in numerous examples, e.g. tunneling and particle decay. Quantum mechanics accepts principles of probability based on statistical observations of frequencies. In an aesthetic approach, however, it is not necessary to place probability in the underlying mechanisms of the universe. Through the window of Planck's constant, the possibility of an aesthetic universe can be explored.

 

Aesthetics

 

Aesthetic modeling is best exemplified in the game of chess. Players' approach to the game has shifted over time from combinational tactics to positional strategy. In the last century, it was common for grandmasters to memorize volumes of combinations so that these repeatable sequences of moves could be recognized and applied, thus linking the present position to the end goal, checkmate. Over the years, however, with the creativity and brilliance of modern grandmasters, positional strategy has come to dominate playing styles. A positional move induces the environment to reveal additional positional and combinational moves. For example, a player unable to link the present state to the end state may move so as to establish a flanking attack. This attack, as the player knows, may create future opportunities in which a definite path to victory can be realized. A good chess player may value a positional move so highly as to sacrifice material, e.g. a rook or a pawn. A deeper understanding of positional play requires an understanding of the positional parameters. These parameters form the foundation of the aesthetic model. In Indeterministic Economics, Aron Katsenelinboigen explains the three classes of positional parameters, namely attributes, structural parameters, and characteristic parameters:

 

(monotonousness or traditionality).

 

Positional strategy is most useful in indeterministic situations. In ignorance of the future, one induces the environment, creating potentials, or predispositions, from which combinational opportunities may emerge at a later time. The exact form of those opportunities is unknown at the time of positional play; otherwise the move would be combinational. This valuation of a positional move with respect to no particular end (here the goal is the combinational move, not checkmate) is analogous to the aesthetic concept of beauty. In the words of Immanuel Kant, "beauty is a form of worth of an object which is appreciated without any idea of a goal." Thus positional play in chess, an indeterministic system, typifies aesthetic modeling.

 

Nature and Probability

 

Nature, in much the same way as chess, also appears to be indeterministic. The classical physicist is a determinist, believing that if at any point in time the complete universal set of initial conditions, every particle's exact position and velocity, could be known, then the state at any future time could be predicted. However, as the quantum mechanic understands, the world does not work this way. Take, for example, the decay of the muon. Muons constantly bombard the Earth's atmosphere, and they are detectable. The proper equipment will amplify two releases of energy associated with each muon. The first represents the kinetic energy released as the muon stops in the detector; the second is the decay of the muon. The time lag between these two events is defined as the decay time, and it is not always equal. After numerous events (~10,000), the distribution of the decay times appears more or less exponential, characterized by the mean decay time. Although the probability distribution pertaining to the next muon decay is known, any particular decay time cannot be predicted.
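The situation can be sketched numerically. The following is an illustrative simulation, not detector data; the 2.2 microsecond mean lifetime is the accepted value for the muon, and the exponential draw stands in for the unpredictable individual decays:

```python
import random

random.seed(1)

TAU = 2.2e-6  # accepted mean muon lifetime in seconds

# Draw ~10,000 individual decay times. Each single draw is unpredictable,
# but the collection follows an exponential distribution with mean TAU.
times = [random.expovariate(1.0 / TAU) for _ in range(10_000)]

mean = sum(times) / len(times)
print(f"sample mean decay time: {mean:.2e} s (expected ~{TAU:.1e} s)")
```

Just as in the laboratory, only the aggregate (the mean and the exponential shape) is reproducible; no individual entry of `times` could have been predicted in advance.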

This inability to link the present state to a future state results in an indeterministic system, and thus the same model used for the indeterministic system of chess could possibly be utilized. For example, consider an electron accelerator. An attribute may be the coordinates of an electron being accelerated. A structural parameter could be the relative speed of two approaching electrons; this would include any relativistic effects of the two electrons' frames of reference. Finally, a characteristic parameter might be the relationship the two particles exhibit which results in the formation of various particles from the energy of the collision.

 

On the Origin of Planck's Constant

 

In the early period of the twentieth century, and after the numerous successes of physics, e.g. Newton's laws of motion, Maxwell's unified theory of electromagnetism, and Einstein's theories of special and general relativity, major contradictions between experiment and theory persisted. Chief among them was blackbody radiation. This classic problem requires the idealization of a body which absorbs and radiates electromagnetic radiation perfectly.

Classical theory predicts the power of the radiation to depend on wavelength according to the Rayleigh-Jeans law,

 

P(λ, T) = 8πkTλ⁻⁴.

 

However, experiment revealed a distribution of power with respect to wavelength which did not blow up at high frequency, but actually went to zero. Max Planck, a German physicist, realized that with a hypothesis of quanta, this blackbody power distribution could be understood.
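The divergence and its resolution can be checked directly. The sketch below, assuming an illustrative temperature of 5000 K, compares the Rayleigh-Jeans formula above with Planck's law (quoted here for comparison; the text does not derive it):

```python
import math

# Physical constants (SI)
h = 6.626e-34   # Planck's constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann's constant, J/K

T = 5000.0      # illustrative temperature, kelvin

def rayleigh_jeans(lam):
    # Classical prediction: diverges as wavelength -> 0
    return 8 * math.pi * k * T / lam**4

def planck(lam):
    # Planck's law: u = (8*pi*h*c / lam^5) / (exp(h*c/(lam*k*T)) - 1)
    return (8 * math.pi * h * c / lam**5) / math.expm1(h * c / (lam * k * T))

for lam in (1e-3, 1e-5, 1e-7):  # 1 mm down to 100 nm
    print(f"{lam:.0e} m  RJ={rayleigh_jeans(lam):.3e}  Planck={planck(lam):.3e}")
```

At long wavelengths the two formulas agree, but at 100 nm the classical curve has blown up while Planck's expression has already turned over toward zero, exactly the experimental behavior described above.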

Planck began by assuming a fundamental difference between electrodynamics, a rigorous, complete mathematical theory, and thermodynamics. Since measurements in thermodynamic systems are necessarily mean values, and some quantities including these values could not be properly known through the mean alone, a new method of approach was acceptable. Planck described a generalized state space for thermodynamic problems, G. He envisioned this space to be compartmentalized into regions of equal size. The number of dimensions of the space is then the total number of generalized coordinates, call them φ, and impulse coordinates or momenta, call them ψ. For example, in describing the state of motion of a particle, one would require six coordinates: the three generalized coordinates φ₁, φ₂, φ₃ to represent x, y, and z, and the three momenta (in this case velocities) ψ₁, ψ₂, ψ₃ to represent vx, vy, and vz. Therefore, the state space G is the sum of all the individual regions dφ and dψ:

 

G = Σ dφ₁ dφ₂ dφ₃ … dψ₁ dψ₂ dψ₃

 

In each region of state space, Planck assumed one would not be able to measure the exact microscopic details; otherwise electrodynamics could be used. Instead, he considered a uniform distribution throughout each region of state space. This he referred to as the hypothesis of elemental chaos. His next big assumption distributed state space democratically among the regions: in a two-dimensional state space, the area of each element is equivalent, and this is accomplished through appropriate limits on each region's integration.

Finally, back to the blackbody radiation problem. Planck modeled this system as a set of harmonic oscillators with a two-dimensional state space comprising the amplitude of oscillation, f, and the change of the amplitude with respect to time, ḟ, or instead ψ, where

 

ψ = Lḟ, where L is a constant.

 

The regions of state space are then described as

 

∫∫ df dψ = h, where h is the constant area of the state-space regions.

 

An elliptical analysis of the state space, dependent on the two coordinates f and ψ, is described graphically in Figure 1.

 

Figure 1.

An Elliptical Analysis of the State Space

 

 

As represented in the figure, the limits of each region must not be equivalent in order for the areas to be equal. This constant area, h, is Planck's constant, and when matched to experimental data it is equal to approximately 6.6 × 10⁻³⁴ J·s.
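The equal-area rings of Figure 1 carry Planck's quantization directly. A minimal sketch, assuming the oscillator has a linear restoring constant K (a symbol not used in the text), shows why each ring adds one quantum of energy:

```latex
% Energy of the oscillator in the coordinates f and \psi = L\dot f:
E = \tfrac{1}{2} K f^{2} + \frac{\psi^{2}}{2L}
% The curve of constant E is an ellipse with semi-axes
% \sqrt{2E/K} and \sqrt{2LE}, hence area
\pi \sqrt{\tfrac{2E}{K}} \, \sqrt{2LE} \;=\; 2\pi E \sqrt{\tfrac{L}{K}} \;=\; \frac{E}{\nu},
\qquad \nu = \frac{1}{2\pi}\sqrt{\frac{K}{L}} .
% Demanding that successive rings each enclose area h gives
\frac{E_n}{\nu} = n h \quad\Longrightarrow\quad E_n = n h \nu .
```

So the "clever trick" of fixing the ring areas at h is equivalent to quantizing the oscillator energies in steps of hν.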

It is important to understand the physical meaning of the magnitude of Planck's constant. Along with the gravitational constant, G, and the speed of light, c, h completely scales the size of the universe. Planck's state space, as described above, illuminates this point. The size of a region element is constant, yet of different value for different systems; in the oscillator problem it took on the value of h. Thus, if the state space is increased, the size of the region elements remains constant, and it is the number of region elements that increases. When the number of region elements grows so great that they seem to blend into each other, the classical limit is attained; this has the same effect as letting the region areas go to zero to represent a continuous system. In fact, letting the regions go to infinitesimals is exactly the practice of the classical physicist. Planck, however, realized he could predict the blackbody radiation curve by a clever trick of quantizing the region elements in state space, restricting the integration of the infinitesimal region elements. Therefore, as Planck has shown, the size of the region elements, h, sets the scale at which the classical and quantum domains can be explored.
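To see why macroscopic systems look continuous, one can count region elements for an everyday system. The pendulum numbers below are assumptions chosen purely for scale:

```python
# Rough illustration: how many cells of area h fit in the phase-space
# area swept by an everyday oscillator (assumed, order-of-magnitude numbers).
h = 6.626e-34  # Planck's constant, J*s

delta_x = 1e-2          # ~1 cm swing amplitude, m
delta_p = 1e-2 * 0.1    # ~10 g bob at ~0.1 m/s peak speed, kg*m/s

cells = delta_x * delta_p / h
print(f"~{cells:.1e} region elements of area h")
```

With some 10²⁸ elements, the regions blend together completely, which is exactly the classical limit described above.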

 

Probability and Planck's Constant

 

Statistical mechanical contribution:

 

Planck justified the equalization of the state-space regions with the requirements of probability based on statistics. This ties Planck's constant to the realm of indeterminism. The indeterminism of the blackbody model, as Planck constructed it, was a priori probability. An example of a priori probability is coin flipping. Consider ten identical coins flipped identically at random, so that each individual event has an equal probability of heads or tails. Rigorous physical calculation of trajectories might succeed in predicting the resulting distribution of heads and tails; however, this is extremely tedious, and a probability distribution over the different sets of outcomes can instead be calculated with statistical mechanics. It is intuitively acceptable that there is a better chance of a set of 5 heads and 5 tails than a set of 10 heads and no tails. In fact, there is only one configuration for the latter, while the former distribution contains 10!/(5!·5!) = 252 configurations. The ratio of these two numbers is the probability that the 5-5 distribution will be realized in the next event versus that of the 10-0 distribution.
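The counting can be verified directly; a minimal sketch:

```python
from math import comb, factorial

# Number of configurations (arrangements) behind each outcome of 10 flips
n = 10
ways_5_5 = comb(n, 5)    # same as factorial(10) // (factorial(5) * factorial(5))
ways_10_0 = comb(n, 10)  # all heads: only one arrangement

print(ways_5_5, ways_10_0)  # 252 configurations versus 1
print(f"5-5 is {ways_5_5 // ways_10_0} times as likely as 10-0")
```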

Similarly, one could study a 10-particle system, considering the magnetic spin moments set in a uniform magnetic field, B. The energy of the system is proportional to the difference between the up and down spins, 2s, where s is the number of spin-up particles minus N/2. In fact, E = 2smB.

Assume the system equilibrates to particular energy states with probability proportional to the number of configurations which exhibit each energy state; then each configuration must be counted equally to satisfy the laws of probability. This is precisely why Planck preordained the limits of integration over the coordinates for each region of state space to result in equal elements of space. In this example, there is only one way in which the particles could be aligned to admit an energy equal to 2·5mB. However, there are 10 ways in which the energy could equal 2·4mB. Thus, it is ten times as probable that the system equilibrates at an energy of 8mB than at 10mB.
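The same counting extends to every energy of the spin system; the sketch below tabulates the multiplicities, with m and B entering only through the unit mB:

```python
from math import comb

N = 10
# With k spin-up particles the spin excess is 2s = 2k - N, and E = 2*s*m*B.
# Map each energy (in units of m*B) to its number of configurations:
multiplicity = {2 * k - N: comb(N, k) for k in range(N + 1)}

# Energy 10*m*B (all ten up, s = 5): one configuration.
# Energy  8*m*B (nine up,  s = 4): ten configurations.
print(multiplicity[10], multiplicity[8])
```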

 

Figure 2.

Particle Energy

 

It is important to emphasize Max Planck's assumption. By requiring the regions of state space to be equal, he coupled his model of thermodynamic phenomena to probability. This is an important step in understanding the progression of probability into the fundamental structure of modern physics.

 

Quantum contribution:

 

Soon after Planck's theory of quanta, further evidence appeared, e.g. the photoelectric effect, and physicists began to believe that quantizing state space was not merely a convenient trick, but a mathematical representation of reality with physical significance. Furthermore, the scale factor h began to reveal the indeterminacy of nature. As mentioned above, Planck's constant sets the scale to which an experiment must be restricted in order to detect quantum mechanical effects. In other words, the size of state space must be reduced relative to the region element h so that the elements do not blend together into the classical limit. An ideal example of reducing the degrees of freedom and complexity of state space is the one-dimensional particle-in-a-box problem. This quantum mechanical limit allows the observation of nature's indeterminism.

Consider a potential well in which the potential energy inside the well is zero while the walls are infinitely high. Classically, if a particle were dropped into this well, after attaining equilibrium the particle would have an equal chance of being found at any position in the well. Quantum mechanically, however, after attaining equilibrium, settling to the ground state, there is a better chance of finding the particle in the middle than near the edges.
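This can be made quantitative with the known ground-state wave function of the infinite well, ψ₁(x) = √(2/L)·sin(πx/L); the sketch below integrates its square over the edge and middle thirds of the well:

```python
import math

# Ground state of a particle in a 1-D box of width L:
# |psi_1(x)|^2 = (2/L) * sin^2(pi * x / L)
L = 1.0

def prob(a, b, steps=100_000):
    # Probability of finding the particle between a and b (midpoint rule)
    dx = (b - a) / steps
    return sum((2 / L) * math.sin(math.pi * (a + (i + 0.5) * dx) / L) ** 2
               for i in range(steps)) * dx

p_edge = prob(0, L / 3)           # left third of the well
p_mid = prob(L / 3, 2 * L / 3)    # middle third of the well

print(f"edge third: {p_edge:.3f}, middle third: {p_mid:.3f}")
```

The middle third carries roughly three times the probability of an edge third, whereas the classical answer would be 1/3 for each; this is the asymmetry Figure 3 depicts.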

 

Figure 3.

Finding the Particle

 

 

It is very important to differentiate this dynamic probability from the a priori probability associated with the coin-flipping problem. Whereas a priori probability seems to have a reasonable underlying mechanism, there seems to be no such analog for dynamic probability. To explore this concept further, the origin of the distribution must be sought.

The probability distribution is just the square of the solution to Schrödinger's equation. Schrödinger's equation is simply a creation, a guess, that happens to predict future events very effectively. Since one cannot derive Schrödinger's equation, and since it appears to be the origin of the probability, perhaps the underlying mechanisms describing this probability distribution either cannot be found or do not exist.

There is another formulation of quantum mechanics, Heisenberg's matrix mechanics (Schrödinger showed the relation of the two methods; there is a matrix analog to Schrödinger's equation). Perhaps an insight into the mechanisms behind quantum mechanical probability can be found here. There are two ways to perceive the postulation of matrix mechanics. The first is the guess, as Heisenberg made it, of the equation

 

[X, P] = iħ

 

which holds as much insight into the probability distribution problem as does Schrödinger's equation. The second, however, has to do with the Heisenberg uncertainty principle. The above equation, with a few steps of mathematics, leads directly to Heisenberg's uncertainty relation,

 


Δx · Δp ≥ ħ/2.

 

This coupling of Heisenberg's postulated equation with the uncertainty principle suggests that the foundation of quantum physics may be found in this uncertainty relationship. This assumption would explain the probability distribution of the particle in the well, namely through Schrödinger's or Heisenberg's equations; but the deeper foundation, or postulation, then gets pushed back to the uncertainty principle.

It is very difficult to accept the lack of classical underlying mechanisms describing the dynamic probability distributions found in quantum physics, in contrast to the a priori probability of statistical mechanics. Therefore, a discussion of the probability distribution of muon decay, among the most completely understood quantum mechanical distributions, is warranted.

Experimentally, the distribution is exponential in form, except for an initial ascent.

 

Figure 4.

Muon Decay

It is one of the great triumphs of modern physics to calculate, from first principles, the mean decay time for the muon and other weakly interacting particles. This theoretical decay time matches experiment excellently. Perhaps, then, all the probabilities and indeterminacies found in quantum mechanics could similarly be explained. This explanation would ease the hearts of those who, along with Albert Einstein, do not want to believe that "God plays dice." Unfortunately, this prediction of the mean lifetime does not necessarily explain the nature of the probability. For instance, why is there a probability distribution for the lifetime of the muon in the first place? There are, however, qualitative attempts to understand the fundamentals of the probability utilizing the relation

 


ΔE · Δt = ħ/2.

 

It must be emphasized that this relationship is not of the same type as Heisenberg's uncertainty. First, there is no uncertainty on time; it is known exactly, to infinite precision. Second, whereas Heisenberg's uncertainty follows from the mathematics, from the existence of the operators X and P, the above relationship is qualitatively inferred. Consider a two-state system, an NH₃ molecule which can be in either an up or a down state. Schrödinger's equation, or equivalently matrix mechanics, predicts oscillation between the two states. For any given unit of time there is a probability that the NH₃ molecule will oscillate to the other state. The unit of time in question is Δt, and to be sure that the NH₃ state has changed, there must be a significant change in its probability amplitude. This is achieved when Δt is such that 2ΔEΔt = ħ, in other words when 2ΔEΔt/ħ is of order unity. Finally, Δt is the decay time for the muon, implying a probability distribution with a characteristic mean lifetime. This particular relationship is just a result of the probability amplitude equation for the system state; with simple algebraic manipulation, however, it converts to the energy-time uncertainty relationship.

Even though the energy-time uncertainty gives some qualitative insight into the probability distribution for the decay of the muon, it is itself postulated. Even if, on the other hand, it were derived, it would still leave the origin of probability in the hands of an uncertainty principle. Therefore, muon decay can be considered fundamentally understood from either Schrödinger's equation and matrix mechanics or the uncertainty principle. As discussed earlier, there seems to be an intrinsic coupling of matrix mechanics and the uncertainty principle which implies Schrödinger's equation. Both need examination for insight into the infrastructure behind probability in quantum mechanics.

The Aesthetic Model

 

As discussed in the previous section, there is opportunity for aesthetic modeling to anchor at two points: either the equations postulated by Schrödinger and Heisenberg or Heisenberg's uncertainty principle. Both possibilities will be treated.

Anchoring at Schrödinger's or Heisenberg's equation means anchoring at the resulting solutions to these equations, which, when squared, represent probability distributions. Once again, consider a universe based not on combinations, deterministic law, but on positional parameters. There would be potentials, or predispositions, for certain events to occur. In the same way the positional play of a chess grandmaster may not be equivalent in different but identical games, the positional parameters of nature are capable of admitting different combinational pathways in repeated identical experiments. This is observable in repeated physics experiments such as muon decay: the decay time of the muon will most likely be different in every identical experiment. Aesthetics and predispositions could help explain the assumed probabilistic mechanisms beneath nature. However, there is still the problem of the continuous probability distributions predicted by quantum mechanics and verified in experiment. Certainly, any theory attempting to explain the apparent probabilistic aspects of nature would have to take this experimental evidence into account. Predispositions would admit a finite number of pathways, thereby not resulting in a continuous distribution of events. But of course there is not necessarily a low limit. Suppose the number of positional parameters were so great, and the proliferation of combinations from each so numerous, that the distribution of pathways appeared continuous. This distribution could be successfully modeled either as random and probabilistic (quantum mechanics) or as the representation of the various combinational pathways (aesthetics).

In this light, Planck's constant once again seems to be the scale factor at which nature reveals its indeterminism. Therefore, Planck's constant could be imagined as a manifestation of relationships between the positional parameters of the universe. This manifestation suggests to the experimenter at which scale the indeterminism of nature's underlying predispositions might be noticeable. At too great a scale with respect to Planck's constant, only the most likely pathways will be realized. Once again, consider the repeated game situations of the grandmaster. Just as the standard deviation of the mean decreases with additional measurements, the diversity of moves will most likely appear greater over the first few games than over a large number of them. In macroscopic situations, the averages wipe out the possibility of the less likely moves. This, of course, is represented in classical physics.

On the other hand, anchoring the aesthetic model to the uncertainty principle is intuitively pleasing. An aesthetic foundation for physical law is inherently indeterministic; it is not a great feat to imagine the coupling.

So where is Occam's razor? Why is all of this important if quantum mechanics already matches experiment so wonderfully? The problem is the foundation of probability built into quantum mechanics. Physicists accept the probabilistic nature of the position of a particle in the well, and they do not question the inability to predict the exact time of the next muon decay. This probability has become ingrained in the scientific community as reality. Aesthetic modeling, however, opens the possibility of a collapse of probability based on frequencies, and this is its justification.

In a universe founded on predispositions, in order to understand the next muon decay the aesthetician would need an intuitive understanding of the positional parameters. Through enlightened positional play and an arrival at a final set of combinational moves, the aesthetician could finally choose a pathway linking the present state to the end goal, perhaps a particular muon decay time. The most difficult part, of course, would be the development of the aesthetic modeling.

 

 

 

Selected Sources

 

"Aesthetics", The New Encyclopedia Britannica, vol.1,pp.149-163.

 

Katsenelinboigen, Aron. Indeterministic Economics. New York: Praeger, 1992.

 

Planck, Max. Theory of Heat Radiation. New York: Dover, 1959.

 
