THE ELEMENTS OF PHYSICS
postulated is equal to the number of resulting conservation laws, and one
might then as well postulate these laws directly. Nevertheless, Noether's
theorem provided the impetus for a large number of ingenious attempts to
link the other conservation laws with other symmetry operations, not
necessarily confined to operations in space and time. For example, in
electromagnetic theory one can perform gauge transformations of the
potentials which leave the electromagnetic field invariant; a number of
authors have attempted to understand the law of conservation of electric
charge as a consequence of this peculiar symmetry of the electromagnetic
field.
In recent years considerable attention has been paid to discontinuous
symmetry operations, such as reflections or inversions of coordinates, which
cannot, in principle, evolve continuously from the identity operation. They
are, therefore, not within the domain of Noether's theorem. Some of these,
such as the symmetry under inversion of coordinates, can be understood as
the origin of conservation laws, such as the law of conservation of parity,
which is violated by weak interactions (see Chapter 16). Others, such as the
symmetry under reversal of motion
(see Chapter 6), do not lead to the introduction of a conserved quantum number,
such as parity, for formal reasons whose physical significance is ill
understood at present. Thus there are symmetries that do not lead directly
to any conservation law, and there are universally valid conservation laws,
such as the conservation of B, L, and Lμ, whose link with symmetries, if there is
any, remains obscure.
Additional Suggested Reading
Feinberg, G., and M. Goldhaber, "Conservation Laws," Scientific American,
October 1963, 36-45.
McCauley, G. P., "Parity," American Journal of Physics 29 (1961), 173-181.
Wigner, E. P., "Violations of Symmetry in Physics," Scientific American,
December 1965.
APPENDIX ONE
The Probability Concept in Physics
One of the principal aims of theoretical physics is the development of
mathematical techniques enabling one to predict the outcome of experiments.
A notable example is Newtonian mechanics, which consists of a set of "laws"
enabling one to predict the future position and velocity of an object under
the influence of some given force, provided one knows the initial conditions,
i.e., has certain knowledge about both position and velocity of the body at
some initial instant of time. Thus Newton's laws of motion together with
Newton's law of gravitation have enabled theoretical astronomers to compute
tables of the future positions of all planets revolving around the sun on the
basis of past observations of position and velocity of these objects.
However, in many branches of physics one encounters objects and
situations whose initial conditions are partially or wholly unknown. This
ignorance about the initial conditions may have various origins. In most
macroscopic cases of this kind it stems from the complexity of the object,
making an exhaustive determination of the initial conditions of all its parts a
practical impossibility.
For example, the application of Newtonian mechanics to the atmosphere for
the purpose of weather prediction suffers from the impossibility of obtaining
complete information about the meteorological state of the atmosphere at
any given initial instant of time. Fortunately, the meteorologist is not con-
demned to complete frustration by his limited knowledge of present weather
conditions, became he has records, and on the basis of past records he can
resort to probability statements about the future weather.
This possibility of weather prediction in probabilistic terms extends even to
similar predictions of a very primitive kind. For example, when an old
resident in a district looks at the sky, sniffs the air, and then pronounces,
"It is probably going to rain tomorrow," what he really means to say, if
he is at all reliable in his observations, is that, whenever in the past sky and
air appeared similar to their present appearance, then more often than not
rain would fall on the following day. The prognosis is thus made on the
basis of a past record of the relative frequency of rain following from a given
set of similar weather conditions.
The meteorologist, who has quantitative records, can couch his prediction
of tomorrow's weather on the basis of today's weather map in a more
quantitative form by saying, "There is a 70 percent probability for rain
falling tomorrow." What he means to convey by this statement is that, in
100 past cases of weather maps resembling today's map within the limits
imposed by incomplete knowledge of the present weather conditions, rain was
recorded in 70 cases on the following day.
It is important to realize that probability statements can be meaningful
even though the surmised cause of future behavior may be based on a complete
misunderstanding of the actual course of events. For example, if one of the
Roman haruspices, who claimed to be able to foretell the future from
examination of the entrails of slaughtered animals, had kept records of how
often finding the liver of a slaughtered lamb on the left was followed by rain
on the next morning, he would have been able to say, quite justifiably,
"There is a 70 percent probability for rain falling on the morning after finding
a lamb's liver on the left." The modern meteorologist will, of course, point
out that according to his records morning rain falls in Rome on 70 out of
100 days picked at random, and if the haruspex had been very clever he might
have arrived at the same conclusion if he had observed that finding the
lamb's liver on the right was also followed by rain on the next morning in
70 out of 100 cases.
The modern meteorologist's confidence in the dependence of tomorrow's
weather on today's weather, rather than on today's position of an animal's
entrails, rests on his ability to improve the degree of certainty with which he
can make his predictions by introducing more and more detailed features
into the classification of initial weather conditions. Indeed, if Newtonian
mechanics were sufficient to grasp all physical processes contributing to
making weather what it is, then in the limit of complete knowledge about an
initial state of the weather one should, in principle, be able to compute
weather maps for all future times similar to the tables with the future
positions of planets available to astronomers.
This perfect predictability of nature from given complete initial conditions
within the framework of Newtonian mechanics seduced many physicists,
particularly during the nineteenth century, into extrapolating it into all
branches of physics. From their point of view any recourse to probability
statements was a somewhat distasteful necessity imposed by nothing else
but the limits of human capacity to record and to compute.
Since the turn of the last century evidence has turned up, notably in atomic
physics, that has led most contemporary physicists to accept the existence of
"laws of nature" whose very formulation requires a probability statement.
A very striking example is the law governing the radioactive decay of certain
atomic nuclei. Experimentation with polonium atoms shows that any
polonium nucleus picked up and put under observation at any given instant
of time will with a probability of 50 percent not survive the next 140 days.
Another way of describing this fact is to say that "the most probable
behavior of a given initial number of polonium nuclei is the decay of half
their number during the next 140 days" or, in jargonese, that "polonium
has a half-life of 140 days." Now what is remarkable is the apparent
impossibility of improving on this prediction of the future behavior of polonium
nuclei. The atomic physicist is not in the position of the weatherman; he
cannot discern on any polonium nucleus any feature marking this particular
nucleus for decay during the next 140 days with a probability of either more
or less than 50 percent.
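The 50-percent survival law stated above is equivalent to saying that a fraction (1/2)^(t/140) of a sample of polonium nuclei survives a time t of days. A minimal Python sketch (the function name is ours, not the book's):

```python
def survival_probability(t_days, half_life_days=140.0):
    """Probability that a single nucleus survives t_days, given the
    half-life quoted in the text (140 days for polonium)."""
    return 0.5 ** (t_days / half_life_days)

# After one half-life the survival probability is exactly 50 percent,
# after two half-lives 25 percent, and so on.
print(survival_probability(140))   # 0.5
print(survival_probability(280))   # 0.25
```

The same function describes any nucleus obeying the probabilistic decay law; only the half-life parameter changes.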
For honesty's sake it should be pointed out here that the "relative fre-
quency" definition of probability adopted by most physicists has a weakness
that has caused much thought, particularly among mathematicians. It is
virtually impossible to prescribe clearly how to distinguish a correlation from
a fluctuation. For example, if someone observes half of a given small number
of polonium atoms decaying within 138 days, he would classify this as a
confirmation of the probabilistic law of decay on the grounds that the
deviation from the average half-life of 140 days can be understood as a
fluctuation typical for processes governed by probabilistic laws. On the other
hand, if a half-life of only 12 days (say) were observed in one particular
experiment, then the observer would tend to classify this as a violation of the
probabilistic law and try to understand it as the result of a correlation with
some as yet unknown cause for this large deviation from average behavior,
because a fluctuation of that amount, although possible in principle, is in
itself a highly improbable event. It is hard to find an unambiguous criterion
for the point at which to draw the line between observations admissible and
acceptable as fluctuations and observations indicating the presence of a
correlation. Most workers in this field adopt some rule of thumb based on
common sense. An example of such a rule, due to Karl Friedrich Gauss
(1777-1855), will be given farther down.
When one tries to apply Newtonian mechanics to the atomic domain for
the purpose of describing and predicting the motion of elementary particles,
one is once again confronted with the necessity of employing a probabilistic
language because of an apparent impossibility, in principle, of knowing
simultaneously, with arbitrary accuracy, the initial position and velocity of a particle.
The uncertainty Δx in the knowledge of the position and the uncertainty
Δv in the simultaneous knowledge of the velocity of a particle of mass m are
apparently governed by an "uncertainty relation" discovered by W. Heisenberg
in 1927,

Δx Δv ≳ h/m,    h ≈ 10⁻²⁷ erg sec.    (A1.1)
Thus any improvement in the localization of an object, i.e., any decrease in
the uncertainty Δx, is purchased at the price of an increase in one's ignorance
of that object's velocity Δv.
Fortunately, this uncertainty relation does not invalidate the applicability of
Newtonian mechanics for practical purposes in astronomy or most branches
of macroscopic physics. For example, suppose one could localize the position
of an artificial satellite of mass m = 10³ kg = 10⁶ g to within one atomic
diameter, Δx ≈ 10⁻⁸ cm. This would engender an uncertainty in the knowledge
of the velocity of the satellite, which could not be compensated in
principle, of amount

Δv ≈ h/(m Δx) ≈ 10⁻²⁷/(10⁶ × 10⁻⁸) ≈ 10⁻²⁵ cm/sec.

Even if the satellite were to travel for a time t ≈ 10¹⁷ sec, of the order of the
age of the universe, the astronomer's unavoidable uncertainty in the prediction
of the satellite's future position would amount to not more than
Δx ≈ Δv·t ≈ 10⁻²⁵ × 10¹⁷ ≈ 10⁻⁸ cm, i.e., the satellite would at the end of
that time still be within two atomic diameters of the position predicted by
Newtonian mechanics.
On the other hand, if one localizes an elementary particle of mass m to
within a radius of

r ≈ h/(mc),    c ≈ 3 × 10¹⁰ cm/sec = velocity of light,

which, for example, for an electron of mass m ≈ 10⁻²⁷ g is of order r ≈
10⁻¹¹ cm, one engenders a practically complete ignorance of the velocity of
that particle, because under these circumstances Δv ≳ c. Thus, the smaller
the mass of a particle, the more elusive it becomes if one wishes to localize it.
At the present time a modification of Newtonian mechanics, known as
quantum mechanics, has gained wide acceptance for the description of atomic
phenomena. It consists of a mathematical formalism allowing one to make
predictions of future probability distributions if an initial probability
distribution is given.
The meaning of the word "probability distribution" can perhaps most
easily be clarified by an example based on the special case of a "dichotomic"
or "either-or" attribute observable on an object. For example, the faces of
a die have the dichotomic attribute of showing either an even or an odd
number. If the die is "honest" and not loaded, the probability for an even
number's turning up in a throw of the die is 50 percent, or 1/2.
If one now plays a game consisting of 10 throws of a die, with the
understanding that the result of each throw is independent of the result of the
preceding throw (the gambler makes sure of this between throws by rattling
the die around in his hand), then one expects as most probable outcome of
the game an even number to turn up 5 times. This does not mean that in the
actual game even numbers will necessarily turn up 5 times. There are always
fluctuations. For example, the outcome of a series of 100 such games is
recorded in Table Al.l and plotted in Figure Al.l.
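A record of this kind is easy to generate by simulation. The sketch below (function and variable names are ours) plays 100 such games with a pseudorandom die and tallies the results, the analogue of Table A1.1:

```python
import random

def play_game(throws=10, rng=random):
    """Count how many times an even number turns up in `throws` rolls of a fair die."""
    return sum(1 for _ in range(throws) if rng.randint(1, 6) % 2 == 0)

random.seed(1)  # fixed seed so the run is reproducible
results = [play_game() for _ in range(100)]

# Tally how many of the 100 games produced each result n = 0..10.
tally = {n: results.count(n) for n in range(11)}
print(tally)
```

Each run shows the fluctuations the text describes: the counts cluster around n = 5 but rarely match the most probable outcome exactly.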
It is instructive to compare this actual outcome with the most probable
outcome one would expect on the basis of a theory of probability that is
particularly easy to develop for this special case. The first step consists of
Table A1.1
Record of a set of 100 games consisting of 10 throws of a die each: for each
result n (the number of times an even number turned up in 10 throws), the
number of games with that result and, in a third column, the probability for
that result. Total: 100 games.
realizing that if one plays a game with N throws then there are 2ᴺ different
possible sequences in which even (E) or odd (O) numbers may turn up.
For example, if one makes 3 throws (see Table A1.2) one has the 8 possible
sequences EEE, EEO, EOE, OEE, OOE, OEO, EOO, OOO.
Now if in each individual throw the probability for an even number to
turn up is equal to the probability for an odd number to turn up (namely, 1/2),
then every sequence has the same probability, (1/2)ᴺ.
In the example of 3 throws, each sequence occurs with the probability 1/8.
However, when recording the outcome of the game as is done in Table Al.l,
one is only interested in the total number n of times an even number turns up
in N throws, independent of the order in which the even numbers turned up in
the sequence.
Thus, in the example with N = 3, there are 3 sequences EEO, EOE, OEE,
which yield the same result (namely, n = 2), whereas only 1 sequence (namely,
EEE) yields the result n = 3. One therefore expects the result n = 2 to be
Figure A1.1
Graphic comparison of the actual record of 100 games of 10 throws each of a
die with the probability distribution for that game; g is the number of games
in 100 that had the result n (i.e., in which an even number turned up n times
in 10 throws).
Table A1.2
Analysis of a game consisting of 3 throws of a die.
three times as probable as the result n = 3, as recorded in the third column
of Table A1.2.
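The counting argument for N = 3 can be checked by brute-force enumeration. A short Python sketch (variable names are ours):

```python
from itertools import product

# All 2^3 = 8 equally probable sequences of even (E) / odd (O) results in 3 throws.
sequences = [''.join(s) for s in product('EO', repeat=3)]
print(sequences)

# Group the sequences by n, the number of evens, as in Table A1.2.
by_n = {n: [s for s in sequences if s.count('E') == n] for n in range(4)}
print(len(by_n[2]))   # 3 sequences give n = 2 ...
print(len(by_n[3]))   # ... but only 1 gives n = 3
```

Since every sequence is equally probable, the result n = 2 is indeed three times as probable as n = 3.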
Quite generally, the probability for a particular result will be proportional
to the number of different sequences yielding that same result. Computing the
probability for a given result is thus equivalent to counting the number of
possible sequences through which one can arrive at that result.
1
1  1
1  2  1
1  3  3  1
1  4  6  4  1
1  5  10  10  5  1
1  6  15  20  15  6  1
1  7  21  35  35  21  7  1
1  8  28  56  70  56  28  8  1
1  9  36  84  126  126  84  36  9  1
1  10  45  120  210  252  210  120  45  10  1
Figure A1.2
Each peg in a chance board is labelled with the number of possible ways in which a
steel ball can reach it. The result is equivalent to Pascal's triangle for the binomial
coefficients. The probability for arrival at a peg in the 10th rung is obtained by
dividing the binomial coefficient by the total number of paths leading to the 10th
rung, namely, 2¹⁰ = 1024. Thus the probability for arrival at the 3rd peg from the
left is 45/1024, or 4.4 percent. These probabilities are recorded in the third column
of Table A1.1 and represented graphically by the broken line in Figure A1.1.
This number for the result n in N throws is equal to the binomial coefficient
bₙ = N!/[n!(N − n)!], and one arrives at the probability p(n) for the result
n by dividing bₙ by the total number of possible sequences, 2ᴺ.
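With the binomial coefficient available, the whole probability distribution for the 10-throw game follows in a few lines. A sketch (the helper name p is ours):

```python
from math import comb

def p(n, N=10):
    """Probability of n evens in N throws of a fair die: C(N, n) / 2^N."""
    return comb(N, n) / 2 ** N

# The most probable result for N = 10 is n = 5:
print(p(5))          # 252/1024, about 0.246
# The distribution is symmetric about the middle: p(n) = p(N - n).
print(p(2), p(8))    # both 45/1024, about 0.044
```

Multiplying each p(n) by 100 gives the expected counts against which the record of Table A1.1 is compared.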
The fact that this number involves the binomial coefficient can be made
visible by a gadget known as a "chance board," which in fact plays a completely
analogous game automatically. It consists of pegs arranged in triangular
fashion on a board, as drawn in Figure A 1.2, such that when a small steel
ball hits a peg from above, there is a 50 percent probability for the steel ball
to be deflected either toward a peg one rung below on the left or toward
another peg one rung below on the right. Thus the number of ways in which
one peg in the Nth rung can be reached is equal to the sum of the ways in
which the two pegs just above it in the (N − 1)th rung can be reached. This
is just the property which defines the binomial coefficients, and the chance
board is a physical representation of Pascal's triangle.
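The chance-board recurrence, in which each peg count is the sum of the two counts just above it, can be written out directly. A sketch in Python (the function name is ours):

```python
def pascal_rows(rungs):
    """Build the peg labels of a chance board: each entry is the sum of the
    two entries just above it, the defining recurrence of Pascal's triangle."""
    rows = [[1]]
    for _ in range(rungs):
        prev = rows[-1]
        # Pad with zeros on both sides so the edge pegs get a single parent.
        rows.append([a + b for a, b in zip([0] + prev, prev + [0])])
    return rows

tenth = pascal_rows(10)[-1]
print(tenth)       # [1, 10, 45, 120, 210, 252, 210, 120, 45, 10, 1]
print(sum(tenth))  # 1024 = 2^10 total paths through the board
```

The 10th row reproduces the peg labels of Figure A1.2, and dividing each entry by 1024 gives the probabilities of the broken line in Figure A1.1.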
The analog of a set of 100 games with 10 throws each of a die consists,
then, of letting 100 steel balls fall through the maze of the chance board and
collecting them in receptacles, one below each peg in the 10th rung. From
Figure A1.2 one can read the probability for arrival of a steel ball at each
peg n, namely p(0) = 1/1024 ≈ 0.1 percent, p(1) = 10/1024 ≈ 1.0 percent,
p(2) = 45/1024 ≈ 4.4 percent, etc., and these probabilities are recorded in the
third column of Table A1.1. Their graphic representation by the broken line
in Figure A1.1 is called the "probability distribution" for that game.
Comparison with the actual distribution shows a deviation, which is
interpreted as a fluctuation. Such fluctuations are typical and are an essential
feature of processes governed by a probabilistic law.
Now if these deviations are either very large or are not "random" but occur
repeatedly in a certain way, for example if the actual outcome for n = 5 is
always less than the expected value 24.6, one will begin to suspect the presence
of some as yet unknown cause that "loads" the game, and under these
circumstances one reclassifies the deviations as "correlations." As was
mentioned earlier, it is difficult to decide where one should draw the line
between admissible fluctuations and suspected correlations.
The mathematician Gauss gave the following advice, based on long
experience with probabilistic behavior. Denote by n₁, n₂, etc., the outcome of
the first, second, etc. game, and suppose the total number of games played
is G. Compute the average of n, denoted n̄ and defined as

n̄ = (n₁ + n₂ + ⋯ + n_G) / G.
(In the example recorded in Table A1.1 one finds n̄ = 4.97.) Then compute
the deviation s whose square is defined as
s² = [(n₁ − n̄)² + (n₂ − n̄)² + ⋯ + (n_G − n̄)²] / G.    (A1.6)
(In the example above one finds s² = 2.57, so that s = 1.6.) According to
Gauss one should suspect the presence of a correlation if there are any games
with a result outside the range n̄ − 3s to n̄ + 3s.
This criterion applied to the example above means that one should have
suspected a correlation if any game had turned up an even number 0 times or
10 times. According to the second column of Table A1.1 no such outcome
was observed, and thus the actual distribution is consistent with an
interpretation that classifies the deviation from the probability distribution
as a fluctuation.
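Gauss's rule of thumb is straightforward to mechanize. The sketch below (the function name and the sample data are ours, not the book's Table A1.1) computes n̄ and s and flags any results outside n̄ ± 3s:

```python
from math import sqrt

def gauss_criterion(results):
    """Apply the 3s rule of thumb described in the text: return the mean nbar,
    the deviation s, and any results falling outside nbar +/- 3s."""
    G = len(results)
    nbar = sum(results) / G
    s = sqrt(sum((n - nbar) ** 2 for n in results) / G)
    outliers = [n for n in results if not (nbar - 3 * s <= n <= nbar + 3 * s)]
    return nbar, s, outliers

# Illustrative outcomes of ten games (hypothetical data for the sketch).
nbar, s, outliers = gauss_criterion([4, 5, 6, 5, 4, 5, 6, 3, 5, 7])
print(round(nbar, 2), round(s, 2))
print(outliers)   # empty list: every game lies within nbar +/- 3s
```

Any result landing in the outlier list would, by this rule, be suspected of signaling a correlation rather than a fluctuation.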
Additional Suggested Reading
Kac, M., "Probability," Scientific American, September 1964, 92-108.