Beichler
Submitted for publication in WISE Journal
Spring 2016
Bubble, Bubble, Toil and Trouble
by James E. Beichler
This paper is excerpted from a prepublication copy of my upcoming history book
Phallacies in Fysics: Volume 2
Modern Physics: A fresh look and reappraisal of the Second Scientific Revolution
Einstein and his colleagues developed valid arguments against the quantum theory in the EPR
paper of 1935 that have never been successfully or completely answered by scientists or philosophers.
Through the gross misunderstanding of those arguments, experimental tests of spin and polarity
entanglement have been used in attempts to convince scientists that EPR was wrong, but it is not.
These experiments do not answer or even address the central issue raised by EPR and no valid test
of the EPR paradox has ever been developed. EPR was about the simple mechanics of motion, not
about electromagnetism or purely quantum effects. But Einstein and his colleagues were also wrong.
They argued against the discrete nature of reality and the particle worldview, neither of which really
addresses the real weaknesses of the quantum theory and the Copenhagen Interpretation. If they had
taken a different approach and sought compatibility between relativity and the quantum they would
have been able to demonstrate that the quantum theory does not depend on either a discrete view of
nature or the particle worldview.
Both relativity and the quantum favor neither discreteness nor extended particles alone, so
debates over whether nature is deterministic or indeterministic are both misleading and irrelevant to the
advance of physics. The whole issue can be settled by finding the physical conditions for
compatibility and the material consequences for fulfilling those conditions. In this respect, the
Heisenberg uncertainty principle can be interpreted in a manner that is not at odds with relativity
theory and Planck’s constant can be defined as the minimum measurable limit for which the physical
reality of space and time can be taken as separate thus rendering Planck’s constant a space-time
coupling factor. By taking this interpretation of the quantum into account, the unification of physics
within a single paradigm cannot be far behind, while the emphasis of the history of physics on
the revolutionary nature of the quantum and the overthrow of Newtonian physics becomes less of a
factor than previously thought. In fact, Newtonian physics was so successful by the beginning of the
twentieth century that it fostered the new revolution: Newtonian relativity of space and time fell to
Einstein’s relativity of space-time at the extremes of speed and gravitational mass, while Newtonian
absolute space and time fell to quantum space-time at the lower extreme of physical size, where the
geometry of points becomes physically relevant.
I. A fresh look at relativity, uncertainty and compatibility
Introduction
Reality is in the eyes, and perhaps the mind, of the beholder. What we see or otherwise sense in
the world around us is what we as sentient beings accept as reality. However, our sensory
perceptions are modified by our minds and training. An artist painting a picture portrays his
individual sense of reality while a musician composing a symphony describes reality and the
harmony of the universe after his own manner. Theologians, philosophers, biologists and other
thinking individuals may wish to interpret the physical universe of experience and the reality that it
represents in still other terms. But the physicist claims the right to discover the nature of physical
reality within the limits of the human mind’s ability to perceive the world about us. While other
disciplines offer views or opinions of reality from their own perspectives, physics claims to seek a
universal perspective, which is as free from bias and opinion as is humanly possible. Yet, in all of
these attempts there seems to be a fundamental assumption that there is some basic form of reality
beyond the mind’s perception of it, and this notion bears investigation.
Two fundamental views of reality dominate the modern scientific community of physicists and
they have been at odds with each other since their initial discovery. In the first view, it is held that
physical reality consists of a space-time continuum as represented by the special and general theories
of relativity. Relativity presupposes that exact measurements can be made on the quantities it
describes. In the other view it is held that there is an inherent uncertainty in our measurement of
specific quantities in nature and we can never determine the exact nature of our physical world. In
other words, science is limited and can only go so far in its quest to know and understand reality.
This second view is based upon the Heisenberg uncertainty principle in any one of its several forms.
However, the discrete view of reality posited by the quantum theorists contradicts the continuous
view supported by relativity even though science investigates one and only one reality.
Philosophical interpretations of the uncertainty principle range from one extreme whereby the
existence of reality is dependent upon some type of conscious perception or human observation to a
conviction that the uncertainty is due to the act of measurement itself. This second extreme implies
that there is a reality beyond human observation and that it might be possible to go beyond the
inherent limits set by the uncertainty principle to discover the true nature of reality. Yet in either
interpretation, the uncertainty principle places limits on our scientific knowledge of reality and
nature is thus rendered indeterministic. This proposition is opposed by relativity theory, which
places no such limits on our knowledge of physical reality and thus assumes that nature is inherently
deterministic. The older classical Newtonian worldview that these two newer worldviews
purportedly overthrew was also deterministic so relativity and Newtonian physics are grouped
together and categorized as classical.
The philosophical basis of this schism of belief was best summarized in what has become
known as the EPR paradox. Einstein, Boris Podolsky and Nathan Rosen published a paper in 1935
that challenged the Copenhagen Interpretation of quantum theory and the probabilistic
interpretation of the uncertainty principle. EPR claimed that quantum theory was incomplete and it
was at least possible to infer both the position and momentum of a material object simultaneously at
the present level of scientific understanding even if they could not be directly measured, implying
the existence of an underlying reality even though science could not yet determine its nature. So the
EPR argument involved an underlying reality independent of the act of observation and the
existence of human consciousness. The feasibility of later scientific advancements, which would lead
to a better understanding of the problem and true simultaneous measurement of position and
momentum, was also implied in the EPR argument.
Bohr and Heisenberg supported the other extreme of interpretation of uncertainty, known as
the Copenhagen Interpretation. These physicists believed that the uncertainty principle placed limits
upon our knowledge of physical reality since we could never measure position and momentum
simultaneously. The problem of exact simultaneous measurement was not in the instruments of
measurement and could thus not be overcome by using more refined instruments or theories
representing further advances in physics. So quantum theory as represented by the uncertainty
principle was thought complete by Bohr and Heisenberg, but incomplete by Einstein. The problem
of measurement was in nature itself and thus a fundamental property of reality.
Bohr’s reaction to Einstein’s accusations of incompleteness was immediate, but the debate has
continued to a greater or lesser extent over the ensuing decades with the majority of scientists siding
with Bohr. To this date, the argument has not been resolved to the satisfaction of all physicists,
philosophers and scholars, yet the failure to settle the argument once and for all has not proved to
be a hindrance to the research, advances and applications of the quantum concept. On the contrary,
the philosophical debates over this issue have encouraged the design and implementation of new
experiments to test these viewpoints, or so it has been thought. One important interpretation of
EPR that has had a stimulating influence on experimental physics was the proposal that unknown or
as yet undiscovered hidden variables are missing from the present quantum theories. The discovery
of these hidden variables could render a more complete description of nature than could be afforded
by the uncertainty principle alone. A great deal of literature, representing both philosophical and
physical researches, has accumulated that discusses the foundations of various hidden variable
interpretations of the EPR paradox.
The wealth of literature on this approach has been centered on the work of John von
Neumann, David Bohm and John S. Bell. In 1932, John von Neumann offered a mathematical
proof that allowed him to decide between classical determinism and the non-determinism inherent
in the quantum theory. He proposed a theorem according to which quantum mechanics could not
possibly be derived by statistical approximation from a deterministic theory of the type used in
classical mechanics. His theorem was quite influential for more than two decades before John S.
Bell found it to contain a conceptual error. In the meantime, David Bohm questioned the
relevance of von Neumann’s proof in 1952 and formally introduced the concept of (non-local)
‘hidden variables’ that had only been implied in the EPR paper, reopening the debate. Bohm’s work
set the stage for Bell’s 1964 discovery of the conceptual error. Bell argued that von Neumann’s
proof and similar analyses left the “real questions” presented by the hidden variable approach
untouched.
Bell’s theorem drew a distinction between the local and non-local features of uncertainty.
According to Bell, hidden variables that only affect nature in the immediate vicinity of an event
could produce observable results that contradicted the predictions of quantum mechanics. However,
the substratum of space and time does not necessarily depend upon such local hidden variables.
Instead, other hidden variables, which acted non-locally and thus affect events over a much greater
range, could be the contributing factors of our deterministic world. When such non-local parameters
vary, they affect the overall substructure of the world. Bell’s theorem suggests that experiments to
distinguish between local and non-local properties governed by quantum mechanics are feasible.
Relativity theory implies that there are no ‘localizable’ particles of matter that conform to Bell’s
model without affecting the substructure of our world. However, recent experiments designed to
test for ‘Bell’s inequalities’ seem, in the eyes of some scientists, to support the Copenhagen
Interpretation in contradiction to relativity theory and the possibility of hidden variables that
conform to Bell’s theorem. Different entities within quantum mechanical systems seem to ‘communicate’ across
space and time, and thus act non-locally, defying the model of space-time which represents relativity
theory. It would seem that Bell’s argument suffers by its strict limitation of EPR to a hidden variable
interpretation as well as the existence or non-existence of ‘localizable’ particles of matter. By
concentrating on particles themselves, Bell has already assumed the ‘discrete’ nature of reality and
runs the risk of only confirming his own initial assumptions.
In Bell’s own words, EPR implied the existence of hidden variables and other interpretations of
EPR were not taken into account.
The paradox of Einstein, Podolsky and Rosen was advanced as an argument that
quantum mechanics could not be a complete theory but should be supplemented by
additional variables. These additional variables were to restore to the theory causality
and locality. In this note, that idea will be formulated mathematically and shown to
be incompatible with the statistical predictions of quantum mechanics. (Bell [1964]
1987, 14)
In this statement the word ‘incompatible’ stands out for its lack of definition while its use implies
that uncertainty and relativity should be compatible for a ‘complete’ description of nature.
Compatibility among the various laws of physics must be considered a desirable quality.
For example, the uncertainty principle is compatible with Newtonian mechanics since the
results of quantum mechanics approach Newtonian mechanics in the proper domain of application.
This property of the theory was termed ‘complementarity’ by Bohr. Special relativity is also
compatible with Newtonian results when speeds are slow compared to the speed of light and general
relativity does not differ appreciably from Newtonian gravitation when dealing with all but the
strongest gravitational fields. These compatibilities could be demonstrated by reducing the equations
of one theory to the other within the appropriate mathematical and physical limits.
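This kind of limuting-case compatibility can be checked numerically. The following is a sketch for illustration only, comparing relativistic and Newtonian kinetic energy at low speed; the mass and speed are arbitrary choices, not values from the text:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def relativistic_kinetic_energy(m, v):
    """Relativistic kinetic energy: (gamma - 1) * m * c^2."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return (gamma - 1.0) * m * C ** 2

def newtonian_kinetic_energy(m, v):
    """Classical kinetic energy: (1/2) * m * v^2."""
    return 0.5 * m * v ** 2

# At v = 0.001c the two expressions agree to better than one part in 10^5.
m, v = 1.0, 0.001 * C
rel = relativistic_kinetic_energy(m, v)
newt = newtonian_kinetic_energy(m, v)
print(abs(rel - newt) / newt)  # small fractional difference, of order (v/c)^2
```

The fractional disagreement is of order (3/4)(v/c)², so it vanishes smoothly as the speed drops, which is exactly the reduction the text describes.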
With all of these well-documented examples of compatibility, it would seem that there would be
some measure of compatibility between relativity theory and the uncertainty principle even though
they seem to prescribe different fundamental notions of reality. If both theories are true as
experiments and observations have shown them to be, there must be relative forms of quantum
theory that accurately describe physical phenomena and vice versa. Theoreticians have developed
quantum models such as quantum field theory (QFT) that take into account relativistic limits and
conditions, but fundamental differences between these two areas of physics still remain in spite of
such successes, which suggests that compatibility between relativity and uncertainty
must exist at some deeper and more fundamental level of reality.
Desperately seeking compatibility
Establishing some form of compatibility would seem to be an important step toward an
ultimate unification of relativity and uncertainty, yet the extent to which the relativity and quantum
theories are compatible has never been determined and the conditions of compatibility have never
been established since relativity and the quantum have been incorrectly assumed mutually
incompatible. Under these conditions, a common philosophical meeting ground for the desired
unification of uncertainty and relativity will never be found unless some criteria for their
compatibility can be determined. At the most basic level, it would be necessary for any criteria to
take into account the fact that different theories must yield the same physical results in areas of
common application. Theories that explain physical reality in different realms of nature must give
the same results in areas where the domains of their explanation overlap. The abstraction of this
notion leads to the possibility that different theories are compatible if the mathematical formulas by
which those theories model nature can be derived from each other. Such derivations not only
represent a criterion for compatibility but also offer a measure of the degree to which different
theories are compatible by establishing the conditions of compatibility.
Special relativity and quantum mechanics share a common realm in the microscopic world of
elementary particles. Therefore, it would seem that the basic formulas of relativity should be
derivable from the uncertainty relationships and uncertainty should be derivable from the properties
of relative space and time. In spite of the obvious similarities between the theories, this task has not
been accomplished and possibly never attempted. Yet clues to the method of deriving relativistic
formulas from the uncertainty relationships are readily suggested by a close examination of the
physical quantities found in the various statements of the uncertainty principle. The better-known
relationship exhibits the variables position in space and momentum, which can be defined as
material change of position in space:

Δx Δp ≥ h.

The second relationship refers to position in time and energy:

ΔE Δt ≥ h.

In both cases, the older Heisenberg forms using ‘h’ for Planck’s constant are used for the sake of
simplicity.
When the quantities in these two relationships are compared to one another, one may suspect
that energy is related to time in a manner similar to momentum’s relation to space. The
mathematical relationship is trivial, but the philosophical relationship is fundamental. If momentum
represents material change of position in space without direct reference to (position in) time, then
energy represents material change of position in time without direct reference to (position in) space.
These interpretations of momentum and energy are not those that are usually expressed, but are
consistent with the uncertainty relationships in which they are found. There is also a precedent for
this view of energy in the field of thermodynamics where the dissipation of energy is considered
“time’s arrow.” The Heisenberg uncertainty principle clearly treats space and time as completely
separate ‘objects’ or physical variables.
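The parallel between the two relationships can also be seen dimensionally: both products Δx Δp and ΔE Δt reduce to units of action (kg·m²/s), the units of Planck’s constant. A minimal bookkeeping sketch, in which the tuple encoding of SI units is an illustrative device of my own, not anything from the text:

```python
# Represent SI dimensions as exponent tuples (kg, m, s).
def dim_mul(a, b):
    """Multiply two quantities dimensionally by adding unit exponents."""
    return tuple(x + y for x, y in zip(a, b))

POSITION = (0, 1, 0)    # m
MOMENTUM = (1, 1, -1)   # kg m / s
ENERGY   = (1, 2, -2)   # kg m^2 / s^2
TIME     = (0, 0, 1)    # s

ACTION = (1, 2, -1)     # kg m^2 / s, the dimensions of h

print(dim_mul(POSITION, MOMENTUM) == ACTION)  # True
print(dim_mul(ENERGY, TIME) == ACTION)        # True
```

Both products carry the dimensions of h, which is why the two relationships can later be set equal to each other.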
When viewed in this manner, the uncertainty relationships undoubtedly make separate cases for
time and space. On the other hand, the fundamental structure of physical reality is a combined
space-time according to relativity. This fact alone would place uncertainty at odds with relativity, but
it also suggests the proper procedure to bring the theories together: Recombining space and time as
expressed in the uncertainty relationships should lead to relativity.
Since the combination of factors, whether ‘Δx Δp’ or ‘ΔE Δt’, are each equal to or greater than
the same quantity, h, they can be set equal to each other:

Δx Δp = ΔE Δt.

And then, by rearranging terms,

ΔE/Δp = Δx/Δt.
At this point it is clearly evident that the ratio Δx/Δt bears closer analysis. The ratio looks something
like a speed, but the fact that it represents instead a ratio of two independent uncertainties does not
allow such a hasty interpretation. However, since we are dealing with uncertainties that have limits,
there is no logical reason why a limit to the ratio cannot be established.
There are three possibilities for such a limit. The ratio Δx/Δt can either be less than c, equal to
c or greater than c where c is the speed of light. It is convenient then to introduce a new concept
called ‘relative uncertainty’, by which the complementary quantities of quantum mechanics vary in
such a way that the ratio of Δx to Δt has a maximum value of the speed of light, or Δx/Δt ≤
c. The alternative case where Δx/Δt > c can then be identified as an ‘absolute uncertainty’. There is
no reason why these ratios cannot be so named, while the justification of such evaluations will soon
become apparent.
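The proposed classification can be stated as a trivial decision rule on the ratio Δx/Δt. The function name and sample values below are hypothetical illustrations of mine, not anything from the text:

```python
C = 299_792_458.0  # speed of light in m/s

def classify_uncertainty(dx, dt):
    """Classify a pair of uncertainties by the ratio dx/dt:
    'relative' when dx/dt <= c, 'absolute' when dx/dt > c."""
    return "relative" if dx / dt <= C else "absolute"

print(classify_uncertainty(1.0, 1.0))    # ratio of 1 m/s      -> relative
print(classify_uncertainty(1.0, 1e-12))  # ratio of 1e12 m/s   -> absolute
```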
The application of relative uncertainty leads to the desired results, in this case, the derivation of
the Lorentz-Fitzgerald contraction. After applying the condition of relative uncertainty to the
combined uncertainty relationships, we have

ΔE = c Δp.

Since the condition of relative uncertainty has consumed the only factor of Δx and the contraction
involves a distance, we must introduce a new factor of Δx to each side of the equation to
“renegotiate” the uncertainty:

Δx ΔE = c Δx Δp.
Letting Δp go to m0c will later allow us to vary ΔE on the right hand side of the relationship and Δx
on the left. If a quantity on one side is certain, then the uncertainty is inherent in the complementary
quantity.
There is no uncertainty in either m0 or the speed c, so these can both be removed as uncertainties,
leaving

m0c² Δx = Δx ΔE.
On the other hand, the energy of the system can be represented by the classical expression for
the kinetic energy of an object, since the two sides of the relationship must be treated differently:

m0c² Δx = (1/2)m0v² Δx.

In the relative case we are not measuring the position of a single point particle as is normally done in
the quantum mechanical case. Instead, we are measuring the relative positions of the endpoints of an
extended object, i.e., the length of an object. So where L is the length at the speed v, the quantity Δx
becomes L on the right hand side of the equation:

m0c² Δx = (1/2)m0v² L.

In a similar manner, the quantity of Δx on the left hand side of the equation becomes (L0 − L),
giving a positive value for the length, which is physically necessary:

m0c² (L0 − L) = (1/2)m0v² L.
Since m0 is not uncertain, it can be withdrawn from the uncertainty term once again:

c² (L0 − L) = (1/2)v² L.

If the original length of a “test” object is L0, then the length at velocity v would be L. L0 would
become L + ΔL, or rather the uncertainty in the new length at speed v would be L0 − L, the amount
contracted. Then, by algebraic manipulation,
L/L0 = 1 − v²/2c².
The quantity on the right of this relationship represents the first two terms of a power series
expansion. The remaining terms of the expansion contribute to the uncertainty in the speed.
Substituting the unexpanded expression yields

L/L0 = √(1 − v²/c²).

Finally, by rearranging the variables, we have an equation which matches the Lorentz-Fitzgerald
contraction of special relativity:

L = L0 √(1 − v²/c²).
This derivation is admittedly ad hoc and non-rigorous from a mathematical point of view, but it still
suggests the degree of compatibility between special relativity and uncertainty. The other formulas
usually associated with special relativity can all be derived in a similar manner, including E = mc². In
each instance it is only necessary that relative uncertainty be involved. It would thus seem that
the condition for compatibility of uncertainty and relativity is the fact that the ratio Δx/Δt is less
than or equal to c.
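The truncation step in this derivation, replacing √(1 − v²/c²) by its first two series terms 1 − v²/2c², can be checked numerically: at modest speeds the two forms differ only at the next order, v⁴/8c⁴. A sketch with an arbitrary test speed of my own choosing:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def contraction_exact(L0, v):
    """Lorentz-Fitzgerald contraction: L = L0 * sqrt(1 - v^2/c^2)."""
    return L0 * math.sqrt(1.0 - (v / C) ** 2)

def contraction_series(L0, v):
    """First two terms of the series: L ~ L0 * (1 - v^2 / (2 c^2))."""
    return L0 * (1.0 - v ** 2 / (2.0 * C ** 2))

L0, v = 1.0, 0.1 * C   # a 1 m rod at one tenth the speed of light
print(contraction_exact(L0, v))   # ~0.99499
print(contraction_series(L0, v))  # ~0.99500
```

At one tenth the speed of light the two agree to about one part in 10⁵, so dropping the remaining series terms is harmless in that regime, as the text assumes.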
The inverse operation of deriving the uncertainty principle from space-time considerations is no
more difficult, but nonetheless different. It also starts from the same premise of relative
uncertainty,

Δx/Δt ≤ c,

which becomes
Δx ≤ c Δt.
From classical physics we have a similar looking but different relationship in

x = vt,

but relativistic considerations whereby c is a limiting speed will allow this to change to

x ≤ ct.
In other words, real material objects can only travel at the speed of light (in the case of photons) or
less.
In any relativistic case, it can be expected that a material particle or photon has a well-defined
position in space-time. So, if a material particle or photon is moving through a distance dx during
the time interval dt and there is an uncertainty in one of these quantities, there will be a
corresponding uncertainty in the other quantity. The previous relationship can then be rewritten as

Δx ≤ c Δt.
This relationship is mathematically equivalent to the above statement representing relative
uncertainty. But from relativity theory we also know that E = mc² and thus c² = E/m, so

Δx ≤ √(E/m) Δt.
We also have the relationship describing the de Broglie wavelength of a particle, which equals h/p.
If we allow the possibility that the uncertainty in x is equal to the de Broglie wavelength, this last
relationship can be written as

h/p ≤ √(E/m) Δt,

after which we find that

h ≤ p √(E/m) Δt.
This equation further reduces to

h ≤ E Δt,

since E = p²/m in the relativistic case. Then by allowing for an uncertainty in energy, we now have
h ≤ ΔE Δt,

which results in the more recognizable formula

ΔE Δt ≥ h.

So we have reached the correct relationship for uncertainty.
Like the case for Lorentz-Fitzgerald contraction, there are inherent deficiencies in this
derivation. The fact that the de Broglie matter wave relationship was necessary to introduce Planck’s
constant and seal the relationship between relativity and the quantum is quite interesting and
significant. It implies that there must be a more comprehensive physical (structural and/or
geometrical) correlation between the concepts of matter waves and the formalisms of relativity than
previously suspected or investigated. However, the fact that the derivation can be completed at all is
evidence that uncertainty can be derived from relativistic principles under the proper conditions.
Therefore, some degree of compatibility must exist between special relativity and the Heisenberg
uncertainty principle.
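The reduction from h ≤ p√(E/m) Δt to h ≤ E Δt rests on the identity p√(E/m) = E whenever E = p²/m, the relation stated in the text. A quick check of that algebra with arbitrary, non-physical numbers chosen only for illustration:

```python
import math

# Arbitrary illustrative values; only the algebraic identity matters here.
m, p = 2.0, 3.0
E = p ** 2 / m          # the paper's stated relativistic relation E = p^2/m

# The step "h <= p * sqrt(E/m) * dt reduces to h <= E * dt" requires
# p * sqrt(E/m) == E, which holds whenever E = p^2/m:
lhs = p * math.sqrt(E / m)
print(math.isclose(lhs, E))  # True
```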
In fact, the missing underlying elements (later designated as ‘hidden variables’ by David Bohm)
to which EPR referred are not that hard to find because they are hiding or rather unnaturally
‘suppressed’ in plain sight within the mathematics. In the relationship ΔxΔp≥h, the notion of some
form of material change (represented by Δp) is evident, but only location in space is specified, and
you cannot have change without a corresponding change in time. So time is the missing element in
this relationship. In the relationship ΔEΔt≥h, some form of material change (represented by ΔE) is
again evident, but without any reference to a change in spatial location. So space is the missing
element in this formulation. Moreover, when these two relationships are equated to one another as
above to derive the formulas of special relativity and Newtonian physics, Planck’s constant h
disappears. Therefore, Planck’s constant h must be the physical binding constant for space and time
to create space-time, such that Planck’s constant is suppressed and does not and cannot
appear as a factor within the theory of relativity.
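The claimed disappearance of h can be seen directly: when the two saturated relations Δx Δp = h and ΔE Δt = h are equated, the resulting ratio ΔE/Δp equals Δx/Δt for any value of h whatsoever. A sketch using exact rational arithmetic, with particular numbers that are entirely arbitrary:

```python
from fractions import Fraction

# For any value of h, dx*dp = h and dE*dt = h force dE/dp = dx/dt,
# so h never survives into the resulting space-time relation.
for h in (Fraction(1), Fraction(7, 3), Fraction(662, 1000)):
    dx, dt = Fraction(5), Fraction(2)
    dp = h / dx   # from dx * dp = h
    dE = h / dt   # from dE * dt = h
    print(dE / dp == dx / dt)  # True, independent of h
```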
Compatibility vindicates Einstein
In the past, criticisms were leveled at Einstein for supporting the validity of the experimental
evidence verifying quantum mechanics while criticizing the Copenhagen Interpretation. In fact,
Einstein was considered a pariah, if not out of touch or suffering from dementia or some other
failing, for the remainder of his life by the greater portion of the physics community for his failure to
accept the quantum theory, even though he only argued against its most common interpretation.
Einstein’s approach to this quantum dilemma was to try and unify general relativity and
electromagnetism in such a way that the quantum would emerge from the mathematics of
unification. He was unable to do this and his method has been all but forgotten in science as a
legitimate effort to come to terms with quantum theory. On the other hand, the above argument for
compatibility supports Einstein’s efforts and his correctness in rejecting the philosophical beliefs of
most of his peers. Einstein seems to have had an intuitive feel for what has been called relative
uncertainty in the EPR argument without ever having articulated the concept. In any case, it should
be clearly evident that there is an underlying assumption that the interactions described could only
occur in a relative space-time in the EPR paper, lending support to Einstein’s and a few others’ (most
notably Schrödinger’s) intuitive acceptance of something like the relative uncertainty described here.
The EPR argument also implied, just as it is assumed in relativity theory, that any two particles
and only two real material particles need be used to establish a relative space-time framework. In
other words, if the universe were limited to or consisted of only two material particles, a relative
space-time would still exist, so some form of physical reality below the quantum limit set by the
Heisenberg uncertainty principle must indeed exist: The two interacting particles in EPR ‘create’ a
relative space-time framework ‘even though’ if not ‘in spite of the fact that’ the Heisenberg
uncertainties are real and are applied to the situation in question. A precedent for this form of
reasoning was already established by Mach’s principle and there is no reason to believe that Einstein
would deny this view given his acceptance of Mach’s principle.
This view can be better expressed in the statement that any two particles in our own universe
are all that is needed to establish a relative space-time. When Einstein argued in EPR that the reality
of two particles was still assured after their mutual interaction, even though the uncertainty principle
would not allow the simultaneous measurement of both their positions and momentums, he was
merely supporting the view that both of these particles had definable positions and momentums
because in principle they still established a relative space-time framework. This assumption, which is
none other than the theoretical basis of the simplest possible form of relativity rather than a full-fledged ‘model’ of relativity, was the fundamental unstated basis of Einstein’s refusal to accept the
Copenhagen Interpretation of quantum mechanics.
For Einstein to have accepted the Copenhagen Interpretation would have amounted to a denial
of the theoretical basis of simple relativity itself, not just Einstein’s theories of relativity, forcing
Einstein to support a double standard in his physics. Under these circumstances, quantum
mechanics was incomplete (but not the idea of the quantum itself or the experimental results of the
quantum theory) and could never be complete in Einstein’s eyes unless science could explain how
the complementary quantities of the uncertainty principle could remain uncertain and still allow the
existence of a definable space-time framework, independent of an observer.
Some scholars have sought to avoid these problems by dissociating Einstein from the EPR
paper. They have suggested that EPR was not written by Einstein and that he only had a limited or
varying amount of influence over its final form, but it cannot be argued that he disagreed with its
basic propositions, which he openly and vocally supported in later years. The points of debate put
forward in the EPR paper are consistent with arguments that Einstein made against the popular
interpretation of the quantum theory both before and after its publication. Einstein clarified his own
arguments in later publications on the paradox and completely supported EPR, no matter whose
writing and prose style dominated the actual EPR paper.
The possibility of hidden variables suggested another solution to this dilemma, but hidden
variables were never mentioned, either directly or indirectly, in the original EPR paper, contrary to
what many have concluded or suggested. If they existed, hidden variables could certainly guarantee
the existence of the necessary space-time framework even though position and momentum could not be
simultaneously defined or measured. This line of reasoning has proven fruitful for experimental
physics, but has not yet proven either the existence or non-existence of hidden variables. Should
such variables exist, they must be so far beyond our present level of comprehension that science has
no idea how they would appear or what form they would take.
Compatibility offers yet another avenue of interpretation which negotiates the differences
between the seemingly incompatible points of view of Einstein and Bohr. In one sense, quantum
mechanics is incomplete because there is a condition, defined as absolute uncertainty, which does
not allow a strictly definable space-time framework. This view accounts for the most extreme
version of the Copenhagen Interpretation whereby reality is completely dependent upon the act of
observation by a conscious being. This notion implies that consciousness must be present before,
and is necessary for, the existence of physical reality. Simultaneously, quantum mechanics is
‘complete’ in Einstein’s sense of the word with respect to another condition as expressed by relative
uncertainty. When the ratio of Δx to Δt is either less than or equal to c, quantum mechanics and
special relativity are compatible and the relative space-time framework is completely defined with
respect to any two or more material particles.
The existence of compatibility literally begs for a reinterpretation of the quantities involved in
the uncertainty relationships. Since the uncertainty formulas separate space from time, Δx Δp > h
can be considered a purely spatial relationship while ΔE Δt > h is purely temporal. Δx represents an
uncertainty of position in space and Δp represents the corresponding uncertainty of a material
change in that position with no direct reference to time or a time frame. Classically and
relativistically, a change in position must include a representation of both the matter undergoing
change in position as well as the duration of time over which that change occurs. Matter is clearly
represented in momentum (the m in mv), but time is only indirectly represented (through v) in the
portion of momentum that represents the rate of change of position. However, the very fact of Δp
(where p = mv in the classical case) means that momentum is considered and taken in the
uncertainty principle as a fundamental quantity in its own right that cannot be further subdivided
into m and v and thus dx/dt in order to exclude time directly but still generate a space-time
framework in the background.
If momentum is to be defined as an independent and elementary quantity, even though any
direct reference to time has been deleted from consideration, the quantity Δp must represent the
uncertainty in material change of position independent of time. The total quantity Δx Δp must then
represent the total uncertainty found in any attempt to determine a fixed position while position is
changing without reference to changing time. This concept is difficult to understand since it would
seem that any change in position could only occur over some time interval, however small.
Therefore this uncertainty relationship could alternately be considered a limit to which position in
space could be determined without reference to time. More specifically, it would seem that the
Heisenberg uncertainty principle implies the possibility of measurement of the stationary state of a
moving particle (a paradox in itself) and thus a limit to how closely science can consider a moving
particle stationary (the duration of time is negligibly small) for the sake of an accurate enough
measurement to establish a fixed (relative) space-time framework.
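As a rough numerical companion to this reading of Δx Δp > h, a minimal sketch in Python may help; note that it uses h itself as the bound, following the paper's notation, rather than the conventional ħ/2 of modern treatments, and the electron-scale numbers are purely illustrative assumptions:

```python
# Minimal numerical sketch of the spatial uncertainty relation as the
# paper states it: dx * dp > h.  The bound h (not hbar/2) follows the
# text's own notation; the atomic-scale example is illustrative only.
H = 6.626e-34  # Planck's constant, J*s

def min_momentum_uncertainty(dx):
    """Smallest dp compatible with dx * dp > h for a given dx (meters)."""
    return H / dx

# Confining a particle to an atomic-scale region (~1 angstrom)
dx = 1e-10
dp = min_momentum_uncertainty(dx)
# dp is on the order of 6.6e-24 kg*m/s, comparable to atomic electron
# momenta, so the bound is physically significant at that scale.
assert dx * dp >= H
```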
The second Heisenberg relationship considers the uncertainties of energy and time. This
relationship represents an attempt to find position in time independent of position in space. While
Δt represents the uncertainty of position in time, which is clearly definable, ΔE represents the
corresponding uncertainty or extent to which time can be measured without regard to spatial
considerations (change of location in space is negligible, whatever that means relative to the energy).
Energy should then be considered a model of material change of position in time independent of
spatial considerations, just as momentum represents a material change in spatial position without
reference to time or a time frame.
In essence, these two uncertainty relationships more appropriately represent fundamental limits
of the extent to which changes in space (spatial position) and time (temporal position) can be
measured while remaining completely independent of one another. Scientists should be asking ‘can
the position in space be completely localized if the corresponding position in time has been
rendered so negligible as to be non-existent and vice versa’? Under these circumstances, the two
formulas of uncertainty almost seem an attempt to return to the Newtonian concept of pure space
and pure time with no mutual physical connection in a modern world of science that has already
proven the attempt futile. When time is independent of space, energy is space-like or acts as a
substitute for space. When space is measured independent of time, momentum assumes the role of a
time-like quantity.
Only when we combine the two uncertainty relationships, thus suppressing h, can we recreate
space-time and regain the corresponding (simultaneous) reality of space-time, leading to the
formulas of special relativity. Since the uncertainty relationships are a model of the extent to which
space and time can act independently, Planck’s constant should be interpreted as the dissociation
constant for space-time although it is more productive to take a positive rather than a negative view
and describe Planck’s constant as a coupling or connectivity constant for space and time in the
continuum. In either case, the value ‘h’ represents the lower limit by which variations of spatial
extension and temporal duration can be considered independent quantities or beyond which
measures of one cannot be made without direct reference to the other.
In this interpretation of physical reality, the ratio Δx/Δt holds a special position that also bears
further analysis. When so analyzed, the ratio can be interpreted in several ways. The statement
Δx/Δt ≤ c, which has been designated relative uncertainty, does not necessarily indicate that a
particle or wave is traveling at or below the speed of light. The quantity c is a universal constant, just
as it is in the relationship E = mc². In this latter formula, the use of c as a constant does not mean
that matter must travel at a speed of c² to be converted to energy. It is sufficient that a particle
moves at or below the speed of light to yield relative uncertainty, but it is by no means necessary.
There may be other physical conditions (or hidden variables) that yield relative uncertainty regardless
of the rate of change of position of any given particle.
On the other hand, rates of motion greater than that of light are sufficient to yield Δx/Δt > c,
but are not necessary for absolute uncertainty. There may be physical conditions (if not hidden
variables) that lead to a situation of absolute uncertainty without regard to the concept of speed as
the rate of change of position, just as there might exist conditions under which travel or
communication that exceeds ‘c’ might exist without violating relative uncertainty. There are neither
philosophical nor experimental reasons to deny the possibilities of these other physical
interpretations of Δx/Δt at this time. So there still remains a possibility that unsuspected (hidden)
variables or other physical factors play an important role in quantum theory, even though hidden
variables (alone) are no longer necessary to assure the compatibility of special relativity and quantum
theory.
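The two regimes named in this section, 'relative' and 'absolute' uncertainty, can be restated as a toy classifier; the function name and the bare-ratio test below are hypothetical conveniences for illustration, not anything taken from the paper itself:

```python
C = 2.998e8  # speed of light, m/s

def uncertainty_regime(dx, dt):
    """Classify a pair of uncertainties by the paper's criterion.

    Returns 'relative' when dx/dt <= c (the case the text identifies as
    compatible with special relativity) and 'absolute' otherwise.
    """
    return "relative" if dx / dt <= C else "absolute"

assert uncertainty_regime(1.0, 1.0) == "relative"    # ratio of 1 m/s, far below c
assert uncertainty_regime(3.0e8, 1.0) == "absolute"  # ratio just above c
```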
The need for further physical analysis and interpretation based upon compatibility is certainly
implied. Perhaps unsuspected physical variables or specifically defined yet undiscovered physical
conditions exist whereby physical interactions act locally, contributing to the substructure of space-time, yet correspond to the case whereby Δx/Δt > c such that c does not specifically correspond to
a rate of change of position or ‘speed limit’ for material bodies. Unsuspected or unknown variables
which act non-locally, while contributing to the sub-structure of space-time may well correspond to
the case in which Δx/Δt < c and c is not a measure of the rate of change of position. And perhaps
the results of recent experiments which support neither ‘Bell’s inequalities’ nor Einstein’s
interpretation of quantum mechanics and special relativity correspond to the case where Δx/Δt > c,
with c representing a true temporal rate of change of position.
By differentiating between these possibilities while demonstrating how they relate to the present
laws of nature, at least a greater understanding of reality is possible. And only a greater
understanding of the nature of reality can lead to observational and experimental results that can
decide the true reality of nature.
II. Afterthoughts on the fundamental nature of the quantum
The Einstein Certainty Principle
Following the above arguments, science is faced with a new ‘Einstein Certainty Principle’
(previously called ‘relative uncertainty’) that was established by the suppression of ‘h’ in the
Heisenberg relationship. We can start from the common forms of Heisenberg's uncertainty
principle,

    Δx Δp > h   and   ΔE Δt > h,

which yield upon recoupling by division

    (Δx Δp)/(ΔE Δt) ≤ 1.
The inequality sign should change direction because we are dividing by ‘uncertainties’, which is in a
sense the same as dividing by a negative value. Δx and Δp are usually grouped together as
‘complementary’ physical quantities, as are ΔE and Δt, but there is no logical reason why other
‘complementarities’ and relationships among this group of physical quantities cannot exist.
For example, by regrouping we get

    (Δx/Δt)(Δp/ΔE) ≤ 1.

This equation can be further reduced by splitting the products into

    Δx/Δt ≤ c   and   Δp/ΔE ≤ 1/c.
The products can only be split in such a way that the units are equal on each side of the equation.
The ratio between Δx and Δt was used above and warrants no further explanation, but the ratio
between Δp and ΔE does. In the limit where the velocity v of a moving particle approaches c, the
Δp approaches a value of mc and the ΔE approaches a value of mc2. These maximum values would
correspond to the maximum values in the uncertainties, or at least the maximum values of the
uncertainties could not be greater than the maximum possible measured value of the quantities. So
the ratio of Δp/ΔE approaches 1/c (or the ratio of the limiting values mc/mc2) as the particle speed
approaches c, which is normal. Under these circumstances, both of these conditions can be
identified as expressing an ‘Einstein Certainty Principle’ (or ‘relative uncertainty’).
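The limiting behavior claimed here for Δp/ΔE can be checked against the standard relativistic momentum and energy, p = γmv and E = γmc², whose ratio is v/c² and therefore tends to 1/c as v approaches c; the following is a short sketch assuming only those textbook formulas:

```python
import math

C = 2.998e8    # speed of light, m/s
M = 9.109e-31  # electron mass, kg (any mass works; the ratio is mass-independent)

def p_over_E(v):
    """Ratio of relativistic momentum to total energy.

    (gamma m v)/(gamma m c^2) reduces algebraically to v/c^2, so the
    ratio approaches 1/c as v approaches c, the limit cited in the text.
    """
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return (gamma * M * v) / (gamma * M * C ** 2)

# The ratio times c climbs toward 1 as the speed climbs toward c.
for frac in (0.5, 0.9, 0.999):
    assert abs(p_over_E(frac * C) * C - frac) < 1e-9
```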
However, there still remains another possibility when complementary pairs are split: the last two
ratios, Δx/ΔE and Δp/Δt. In this case, the suppression of Planck's constant is rescinded and we
have

    (Δx/Δt)(Δp/ΔE) ≤ 1.

By splitting up the uncertainties into a new ratio combination, we have

    (Δx/ΔE)(Δp/Δt) ≤ (c/h)(h/c)

and finally

    (Δx/ΔE)(Δp/Δt) ≤ 1.
Unfortunately, this set of complementary pairs does not, nor could it, yield the desired results
because the units are all wrong. In fact, these two ratios, Δx/ΔE and Δp/Δt do not yield any
meaningful data using any combination of the physical constants h and c. However, it was pointed
out above that the time element of uncertainty Δt was suppressed in the relationship Δx Δp > h by
the uncertainty in the momentum Δp, so perhaps the ratio of Δp/Δt should make no sense. On the
other hand, the ratio of Δx/ΔE also forms no relationship, perhaps because the uncertainty in
position was suppressed by the ΔE term in the relationship ΔE Δt > h. So the ratios of Δx/ΔE and
Δp/Δt make no sense in terms of the uncertainty relationships. Nor should they make any sense
because the space-time structure cannot be separated in this manner. Perhaps they are not
complementary pairs in the same sense of the word as the other complementary pairs.
On the other hand, if we begin with the original pairing that provided no new information or
'conditions' for splitting the groups,

    (Δx/ΔE)(Δp/Δt) ≤ 1,

and just rearrange the pairs by algebraic means as

    (Δp/Δt) ≤ (ΔE/Δx),

then a startling new relationship emerges. The ratio Δp/Δt looks a lot like Newton's second law
dp/dt, or at least it takes the same form and has the same units. So the term could just as well
represent an 'uncertainty' in a force F resulting from a changing momentum as indicated:

    F ≤ ΔE/Δx.

Then multiplying both sides by the quantity Δx, we get

    F Δx ≤ ΔE.

However, it is commonly known that the work done in any situation is equal to the difference
between the initial and final energies,

    Work = ΔE,
which is called the Work-Energy theorem in physics. In Newtonian physics the symbol Δ refers to a
difference or change in the designated physical quantity, yet in many cases that change may
correspond to the quantum defined uncertainty for practical purposes at the quantum level of reality.
The work is also the product of the force F acting through the change in position or distance
through which the force acted,

    Work = F Δx.

In its vector form, the work is given as

    Work = F · Δx,

or in terms of energy,

    ΔE = F · Δx.

If the uncertainty in energy is equivalent to the change in energy and the distance through which
the force was applied to do the work or change the energy, then the last form of the uncertainty
relationship,

    ΔE ≥ F Δx,
is none other than an expression of the Work-Energy theorem. The Heisenberg uncertainty
principle has now been developed into a mainstream formula from Newtonian physics. In other
words, there is far less of a difference between quantum physics and Newtonian physics than science
has ever suspected.
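The Work-Energy theorem that the derivation lands on is easy to verify numerically in the simplest classical case, a constant force on a particle starting from rest; a minimal sketch with illustrative numbers:

```python
# Verify Work = F * dx equals the change in kinetic energy for a
# constant force acting on a particle initially at rest (classical case).
m = 2.0    # kg
F = 10.0   # N
dx = 5.0   # m

work = F * dx  # 50 J of work done by the force

# Kinematics for constant acceleration a = F/m starting from rest:
# v^2 = 2 a dx, so the final kinetic energy is 0.5 m v^2 = F dx.
v_final = (2.0 * (F / m) * dx) ** 0.5
delta_KE = 0.5 * m * v_final ** 2

assert abs(work - delta_KE) < 1e-9  # the Work-Energy theorem holds
```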
By the same token, if the DeBroglie matter wave equation

    λ = h/p

is applied to the combined uncertainty relationship, we get

    ΔE ≥ (h Δx)/(λ Δt).

Once again, we get a wholly new formula by simple algebraic manipulation, and if we allow the
uncertainty in spatial position Δx to equal the wavelength λ of the matter wave, this reduces to

    ΔE ≥ h/Δt

and finally

    ΔE Δt ≥ h.

This last algebraic expression is just a Newtonian derived version of the uncertainty principle as
modified by the DeBroglie matter wave.
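The role of the DeBroglie relation in this chain can be checked directly: with λ = h/p, taking the position uncertainty Δx equal to λ and the momentum uncertainty Δp equal to the momentum p itself (the limiting choices implicit in the text, stated here as explicit assumptions), the product Δx Δp collapses to exactly h; a sketch under those assumptions:

```python
H = 6.626e-34  # Planck's constant, J*s

def de_broglie_wavelength(p):
    """lambda = h / p for a particle of momentum p (kg*m/s)."""
    return H / p

p = 1e-24                      # an atomic-scale momentum, illustrative only
lam = de_broglie_wavelength(p)

dx = lam  # assume the position uncertainty equals one matter wavelength
dp = p    # assume the momentum uncertainty equals the momentum itself

# The product collapses to exactly h, mirroring the dE*dt = h endpoint
# of the text's algebraic chain.
assert abs(dx * dp - H) < 1e-40
```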
These derivations are trivial and non-rigorous, not much more than playing with the variables
to see what happens, but they are still informative because they demonstrate that relativity and
classical physics must be completely (mathematically) compatible with the quantum at some level.
They also depend on how the uncertainties involved are determined, but they are nonetheless
intriguing. Like the case for special relativity, there seems to be an underlying reality that allows a
strict compatibility between Newtonian physics and the uncertainty principle, while this derivation
questions both the validity of the indeterministic worldview and the particle worldview – neither of
which had anything to do with this derivation or affected its outcome in any manner. Just as these
derivations depended on the specific and admittedly convenient choice of values for the
uncertainties, the indeterminacy and particle worldviews are nothing more than specific and
convenient choices made by individuals rather than the requirements of nature. Furthermore, it is
rather strange that a theory that purports to be non-classical (the quantum theory) could so easily
and trivially be converted into a classical form, while a special condition (Δx/Δt ≤ c) was necessary
to bring space and time together to develop a relativistic type structure. Given these facts, it might
seem that quantum theory is really more ‘classical’ than was previously thought or desired.
This newly proposed strange ‘classical’ nature of the quantum theory is also reflected in the
manner by which the uncertainty relationships reduced to the Work-Energy theorem. The
uncertainty pairs involved could not be split apart in any meaningful manner as had other
complementary pairs of uncertainties, which would seem to indicate that they are somehow related
to each other in a far more fundamental manner than the other pairs that could easily be split apart.
In other words, the concept of energy and its relationship to forces acting over distances is an
extremely fundamental concept that expresses itself at an equally fundamental level with the very
nature of space-time itself.
This comparison further indicates that something far different than ‘normal’ occurs at a more
fundamental level of reality than that depicted by the quantum when space and time are rendered
asunder. Force, energy and space-time are intimately related at the most fundamental level of reality,
which is well below the quantum level of observed fundamental particles. It should also be quite
clear that the Work-Energy theorem formula that was derived from the uncertainty relationships
demonstrates that Newtonian physics is completely compatible with the Heisenberg uncertainty
principle and thus quantum theory in general, as is special relativity.
One further obvious fact that develops when we bring the two forms of Heisenberg’s
uncertainty principle together, such that
    (Δx Δp)/(ΔE Δt) ≤ h/h = 1,
should also be taken into account. When a scientist or mathematician speaks of statistics or
probability functions, the value of ‘1’ translates as a 100% probability. So, bringing space and time
back together in this manner merely translates as turning the probabilities associated with the
Heisenberg uncertainty principle into a dead certainty, a probability of one, further
bolstering the argument that the old debate between determinism and indeterminism is irrelevant to
the reality described by physics, which takes place in space-time and not space and time separately.
Within a fully specified space-time framework, there can be no uncertainty because the uncertainty
comes from unnaturally splitting space and time apart, or rather considering them physically
unconnected (or unattached) in specialized experiments.
There is nothing in the above mathematics based arguments that either favors or indicates that
either the continuous or discrete view of ultimate reality is the correct view. On the other hand, the
existence of a physical ‘reality’ beyond the intervention of consciousness or conscious choice is
completely guaranteed in quantum theory by the ratio ΔxΔp/ΔEΔt = 1 even though quantum
theory has previously been interpreted as indeterminate and random. All that is indicated is that a
ratio of Δx/Δt must be ≤ c and the ratio of Δp/ΔE must be ≤ 1/c. This simple fact could simply
mean that uncertainty and relativity coincide within the boundaries of the light cone in a space-time
diagram, while absolute uncertainty corresponds to the absolute elsewhere. The absolute elsewhere
beyond the light cone could thus correspond to either a physics where space and time are not linked
together in the manner of space-time or the physics of a possible region of reality below the space-time threshold that is established by the limiting factor of the quantum or Planck's constant.
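The light-cone picture invoked here is the standard special-relativistic classification of separations between events; the mapping of the author's terms onto 'timelike' and 'spacelike' in the sketch below is a hypothetical restatement for illustration, not something the paper spells out:

```python
C = 2.998e8  # speed of light, m/s

def interval_type(dx, dt):
    """Standard SR classification of the separation between two events.

    'timelike' (inside the light cone) when dx/dt < c, 'lightlike' on
    the cone itself, and 'spacelike' (the 'absolute elsewhere') when
    dx/dt > c.  In the text's terms, the first case corresponds to
    relative uncertainty and the last to absolute uncertainty.
    """
    ratio = dx / dt
    if ratio < C:
        return "timelike"
    if ratio == C:
        return "lightlike"
    return "spacelike"

assert interval_type(1.0, 1.0) == "timelike"
assert interval_type(C, 1.0) == "lightlike"
assert interval_type(1e10, 1.0) == "spacelike"
```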
In either case, whether one believes that the discrete or continuous worldview is more universal
and fundamental to reality is a matter of personal choice and must be based upon observation, not
opinion. Unless, that is, new information or observations come along to push the choice in one
direction or the other. Otherwise, the fact that the DeBroglie concept of a matter wave was crucial
to the derivation would seem to indicate that waves are more real than the moving particles that they
represent. In either case, this argument does nothing to support philosophical debates whether
reality is indeterminate or determinate and thus implies their complete irrelevance to physics and the
true nature of reality.
Not even wrong, not even relevant
When two real particles physically interact in some manner we only observe (‘see’ or ‘sense’) the
three-dimensionality of relative space through either the mechanical connection (passive) as
described by EPR or the electrical connection (active/reactive), both of which correspond to scalar
potential field structures. However, there exists a far deeper, subtler and more fundamental structure
to space-time (reality) that is based upon the vector potential and need not be restricted to just three
dimensions of space. The particles that have interacted mechanically, as described in the EPR
model, remain connected (entangled) within the four-dimensional geometrical structure of space (or
the five-dimensional structure of space-time) by an EM vector potential field – hence they must be
connected by two forms of entanglement – an electromagnetic entanglement and some as yet
unsuspected form of deeper rooted material entanglement.
The unsuspected material entanglement must be either structural and/or geometrical (hence
passive) and accounts for both the EPR described entanglement (passive) as well as spin
entanglement (active), while the EM vector potential entanglement accounts for polarity
entanglement (active). The passive entanglement is activated when two material particles come
within a quantum of each other and thus initiate an interaction, becoming both spin-coupled (at least
until they interact elsewhere to break the spin entanglement by forming new entanglements) and
mechanically entangled to create the space-time continuum (as described in EPR). All three types of
entanglement are structure based and that structure could only originate in the field or space-time
continuum with which the field interacts.
Although it is not obvious, the fact that the logical arguments above make no reference to
whether matter or the position of matter refers to continuous or discrete elements is extremely
significant. Since the equations of special relativity can be derived, even in such an unorthodox and
non-rigorous manner, from the uncertainty principle without any reference to either the discrete or
continuous nature of reality, this derivation implies that the choice of whether matter, time and
space are discrete or continuous at the most fundamental level of reality is no more than a personal
choice. The uncertainty principle is about position in either space or time, not specifically about
particles. As such, the personal choice must be based upon something other than mathematic or
scientific priorities, which introduces the possibility of an a priori bias. Nothing in the physics or
math used up to this point ‘requires’ one form of matter, space and time over the other even though
the mathematics does imply one over the other. Any reference to continuity is assumed in the
fundamental theorems of the mathematical systems rather than the physical arguments themselves.
These derivations further indicate that Planck’s constant should be regarded as a space-time
coupling or connectivity constant. Only when Planck’s constant is suppressed by relating the
separate spatial and temporal forms of the Heisenberg relationship can the two forms be united to
form a legitimate physical space-time yielding the formulas of special relativity. This fact would seem
to give slightly more credence to the possibility that the ultimate physical reality, whatever it might
be, is continuous rather than discrete as posited by quantum theorists and philosophers.
Furthermore, all of the physical quantities involved in the physical derivation as well as the
logical argument are classical physical quantities. No reference has been made to either
electromagnetic waves or other continuous fields in any form, nor has there been any reference to
purely quantum mechanical quantities such as spin that are clearly not well understood or defined at
the fundamental level of physical existence even by quantum theorists. In all the experiments that
have been conducted to date to verify the existence of ‘entanglement’, only the quantities of ‘spin’
and polarization have been shown to be ‘entangled’ at the quantum level of physical reality. This fact
has spawned a small debate within science whether the concept of ‘entanglement’ verifies or fails to
verify and thus negates the relativistic viewpoint proposed in the EPR argument. However, these
experiments neither confirm nor deny either the special relativistic or the quantum mechanical
approach to reality. Nor could they confirm or deny the quantum mechanical view of an ultimate
discrete physical reality because that view is presently incomplete according to the above argument.
However, the same is true in the case of special relativity because the ‘entanglement’ verifying
experiments deal with either electromagnetic quantities (polarization) or quantum mechanical
quantities (spin), which have no classical mechanical equivalents. Special relativity deals specifically
with classical mechanical quantities even though it is based on electrodynamics. Many scientists have
suggested that ‘signals’, whatever form they might take, would have to travel faster than the speed of
light between entangled spin particles or entangled light waves to ‘fix’ their measured values or
otherwise demonstrate the observed ‘entanglement’. However, special relativity does not imply that
the speed of light must be a limit for anything other than classical physical quantities, which leaves
out spin and polarity.
Special relativity is based upon the assumption demonstrated earlier by Maxwell’s
electromagnetic theory itself that the speed of light is constant within the normal four-dimensional
space-time continuum or relative space that is determined by the relative position of material
particles and bodies, and no more. Special relativity places no such restriction on spin coupling. In
fact, the fundamental nature of ‘spin’ is unknown except for the fact that it cannot be the same as
the classical notion of spin as in the example of classically rotating material bodies. Therefore,
whether the entanglement experiments that have been conducted so far confirm the validity of
special relativity or not remains an open, completely unanswered and seemingly unanswerable
question.
On the other hand, any argument regarding whether science (physics) is (or should be)
deterministic or indeterministic is completely irrelevant to the physics involved in this case as well as
completely misleading with respect to any attempts to discover the ultimate nature of the quantum
as expressed by Planck’s constant ‘h’. The only concepts relevant to the physics in question are the
connection between space and time on the one hand and the relationship between matter, space and
time on the other hand. The apparent fact that space and time can be rejoined after being separated
in the Heisenberg Uncertainty Principle, such that the particular formulations of special relativity are
derived in a new and unsuspected manner, indicates that space and time are inseparable in physical
reality and must therefore be considered as intimately connected, giving indirect precedence to
continuity and the relativity theory. Planck’s constant ‘h’ must be considered a coupling or
connectivity constant between space and time if it has any physical meaning at all other than a
simple mathematical construct constant like π.
In so far as ‘indeterminism’ has any value or place in physics at all, the term only has meaning
when space and time are artificially or naturally decoupled at the quantum level. This would imply
that any physical or experimental attempt to decouple space-time constraints on physical reality by
packing enough energy and/or momentum into a single material interaction or event, i.e., a high-energy particle collision, would overcome space-time barriers leading to the formation and quantum
packaging of matter, thus resulting in the cascade of new particles and light rays. Unless the new
particles formed during such collisions conform to the natural restraints or physical constraints of
space-time coupling they would be unstable and decay until stable particles are reached and/or
energy is lost through particle momentum and/or electromagnetic radiation in keeping with the laws
of mass and energy conservation. Without understanding these issues and their consequences, the
unification of quantum and relativity cannot take place because scientists can always invent new
particles to explain space-time resonances as an excuse not to do good physics.
If all of this is true, then a new question arises as to why and how quantum physics has been so
successful and fruitful in predicting actual physical events at the quantum level of reality? Unstable
particles (field resonances) created during high-energy collisions would ‘bleed off’ in time (possibly
following the constraints of the hyperbolic geometry that is implied by Minkowski space) until
spatial conditions were correct for the formation of secondary particles with the excess energy
dissipating through kinetic energy or gamma rays.
The improbable and unlikely nature of uncertainty
Most physicists would simply claim that there are no problems with the quantum theory at
present based on its many huge successes, but that claim would not be true. These supposedly non-existent 'problems' (especially with the Standard Model) stem from the fact that the various
quantum models of reality do not directly address the basic nature of (as well as the fundamental
relationship between) space and time: Quantum theory makes no basic statements about the space-time continuum even though quantum theory adheres to the physical limits placed on nature by
special relativity while quantum theorists do not understand the true nature or basis of the quantum
itself as expressed by Planck’s constant. Special relativity has never been fully incorporated into the
quantum theory although QFT, ‘quantum field theory’, does take into account the relativistic limits
as material bodies approach the speed of light and Dirac’s equation merges the quantum theory with
special relativity at a superficial level of reality.
However, taking note and compensating for these limits in their calculations does not constitute
a full theoretical unification of special relativity and the quantum theory. If nothing else is certain,
then at least this simple fact is: these theories need to be unified at a vastly more fundamental level
than has been accepted or even attempted in the past. This much was admitted by Julian Schwinger,
a co-founder of ‘quantum electrodynamics’ (QED), one of the primary quantum models that make
up QFT.
Schwinger summed up his view on the situation in 1956.
It seems that we have reached the limits of the quantum theory of measurement,
which asserts the possibility of instantaneous observations, without reference to
specific agencies. The localization of charge with indefinite precision requires for its
realization a coupling with the electromagnetic field that can attain arbitrarily large
magnitudes. The resulting appearance of divergences, and contradictions, serves to
deny the basic measurement hypothesis. We conclude that a convergent theory
cannot be formulated consistently within the framework of present space-time
concepts. To limit the magnitude of interactions while retaining the customary
coordinate description is contradictory, since no mechanism is provided for precisely
localized measurements. (Schwinger, xvii)
He explicitly stated that QED had only reached a specific limit whereby it could not be judged
without reference to an outside framework of space-time. So it would seem that the space-time
framework of relativity theory has never been completely absorbed into the quantum model or
worldview.
This fact did not go unnoticed by other scientists. A similar notion was expressed by Victor
Guillemin in his history of quantum theory.
The ambitious program of explaining all properties of particles and all of their
interactions in terms of fields has actually been successful only for three of them: the
photons, electrons and positrons. This limited quantum field theory has the special
name of quantum electrodynamics. It results from a union of classical
electrodynamics and quantum theory, modified to be compatible with the principles
of relativity. (Guillemin, 176)
According to Guillemin, QED was only “compatible” with the principles of relativity. So the
quantum theory seems never to have been ‘unified’ with special relativity in any meaningful manner,
in spite of the fact that many modern quantum theorists believe that the two have been ‘completely’
unified by QFT and no further work in that area of physics is necessary, wanted or even relevant.
Yet more recent publications seem to have lost (or subconsciously repressed) the original
attitude that QED did not ‘unify’ relativity and the quantum. According to David McMahon:
Quantum field theory is a theoretical framework that combines quantum
mechanics and special relativity. Generally speaking, quantum mechanics is a theory
that describes the behavior of small systems, such as atoms and individual electrons.
Special relativity is the study of high energy physics, that is, the motion of particles
and systems at velocities near the speed of light (but without gravity). (McMahon, 1)
This statement came at the very beginning of his book on QFT. Yet later in the book, he expresses
the same sentiment a bit differently.
Quantum field theory is a theoretical framework that unifies nonrelativistic quantum
mechanics with special relativity. (McMahon, 20)
It would seem that the ‘combination’ of special relativity and the quantum in QFT has now been
elevated to a full-fledged ‘unification’. It is this attitude that is confounding because assuming a full
unification means that there is no need to go any further in unifying the two fundamental paradigms
in modern physics: It has already been done, or so many scientists would now argue. In fact, the two
theories have never been satisfactorily ‘unified’ and remain to this day ‘mutually incompatible’.
Even Paul Dirac, whose early work opened the road to the later development of QED and
QFT, never accepted the direction that QFT and the particle theory (both of which form the basis
of the standard model) eventually took. In the beginning, Dirac was troubled by the infinities that
kept popping up in the quantum solutions, but even after those problems were purportedly
overcome by ‘renormalization’ and QED was developed Dirac still found room for rejection of the
new ideas. Dirac once stated that “I might have thought that the new ideas were correct if they had
not been so ugly.” (Quoted in Farmelo, 2009) He thought it foolish in QED to “try to advance
particle physics until the interaction between the photon and the electron was better understood.”
(Farmelo, 2009)
By the late 1960s, Dirac still found fault with the QFT and particle theories: He would often
point out “what he saw as the crippling shortcomings of QED and urged other colleagues to develop
a revolutionary new theory to replace the one he had co-discovered.” (Farmelo, 2009) It should be
obvious to anyone and everyone that the QFT and particle theories have serious problems that have
been glossed over in the past by their ‘protectors’ against the advice and beliefs of many highly
recognized and respected physicists. It should also be equally evident that the quantum theory and
relativity have never been unified in any meaningful manner.
On the other hand, special relativity puts time on an equal footing with space in the space-time
continuum, yet nobody understands either the true nature of time itself or the true significance of
time in any of the various physical models of reality, let alone the quantum model of time with
respect to physical reality, whatever that reality is. Quantum theory must therefore resort to
probabilistic means and interpretations as a substitute for its own failures and the lack of
understanding of its own true nature. To what then does the ‘idea’ of uncertainty in measurement
truly correspond? Does uncertainty truly represent a fundamental deep-rooted indeterminacy in
nature or does it represent something else such as a ‘coincidence’ problem for measurement? If so,
then why does the idea of uncertainty correspond so well to probabilistic interpretations of physical
phenomena within the natural world? Answering these questions is completely and absolutely
necessary before any meaningful and relevant understanding of the nature of the quantum and its
relationship to relativity and space-time can be developed.
All aspects of modern quantum theory, from the basic Schrödinger wave equation to the Klein-Gordon and Dirac equations, the QFTs and the most recent advanced quantum (wave) models, deal specifically
with energies, either kinetic or potential. These energies are derived from potentials that exist before
the interactions that become energetically driven as expressed in the Lagrangian and Hamiltonian
functions that characterize the psi function in the basic Schrödinger equations. Yet scalar and vector
potentials would have to be measured at a point in space, since they vary from point to point, and
measurement at a point is physically impossible. Nothing can be measured at a ‘point’ in
space even though it may well be determined to occupy or be ‘located at’ a discrete ‘point’ in space.
Perhaps the varying values of potential can be mathematically specified at ‘points’, but they
cannot be physically measured at each and every ‘point’. So the physical quantity that is actually
measured is a potential difference over some ‘extended’ region of space. We merely call it a ‘point’ in
space as the extended region over which a measurement is taken grows or is made smaller and
smaller. That lack of specificity accentuates the difference between the mathematical concept of
potential and its physical reality. Even with this physical truism in mind, we can still only measure
scalar potentials in three-dimensional space. Measuring vector potentials over an ‘extended’ region is
still impossible even at the quantum level.
The quantum theory may adequately account for the vector potential in a roundabout manner,
but it cannot completely explain its existence. In quantum theory, the vector potential is due to
‘quantum interference’ as displayed in the Aharonov-Bohm effect (Penrose, 454), but this does not
really explain the vector potential so much as it explains it away, or makes an excuse not to
inquire any further. Vectors have length, according to their simple definition, but no vector potential
‘length’ exists in three-dimensional space, even at quantum distances. So the quantum theory is no
better than classical physics with regard to explaining vector potential. The mere fact of its existence
is just accepted as is, without further explanation.
However, the real existence of vector potentials, as verified by the Aharonov-Bohm experiment,
could be simply explained by the existence of a higher dimension of space. While the idea is radical,
it is implied by the classical mathematical model of electromagnetic theory. The cross product in
electromagnetism that explains the classical magnetic force should yield a vector quantity in a higher
dimension. On the other hand, nothing in science, including both classical theories and quantum
theory, either requires space to be three-dimensional (space-time to be four dimensional) or
prohibits the possibility of higher dimensions. If a higher dimension of space is taken as real, then a
great deal in both classical and modern physics including vector potentials begins to make more
sense.
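The classical cross product referred to above can be sketched in a few lines; the vectors below are illustrative assumptions (not values from the text), chosen to show that the result of v × B is perpendicular to both inputs, which is the geometric fact the argument turns on.

```python
# Minimal sketch of the 3-D cross product used in the classical
# magnetic force law F = q (v x B). Inputs are illustrative only.

def cross(a, b):
    """Cross product of two 3-vectors (right-hand rule)."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

v = (1.0, 0.0, 0.0)   # velocity direction
B = (0.0, 1.0, 0.0)   # magnetic field direction
F = cross(v, B)       # force direction per unit charge

print(F)                      # (0.0, 0.0, 1.0)
print(dot(F, v), dot(F, B))   # 0.0 0.0 -- perpendicular to both inputs
```

The perpendicularity of the result to the plane of v and B is the property the text interprets as pointing toward a higher dimension; in standard vector analysis it is simply read as a pseudovector in the third spatial direction.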
But then a completely different issue is also raised by differences between the concepts of
‘measurement’ and ‘interaction’. A measurement is a special type of physical interaction that implies
a willful act while the term interaction implies something well beyond the same willful measurement.
Interactions, as a larger group than just measurements, also include any strictly physical comparison
to a standard and possibly connections between different physical ‘things’ without the intervention
of will. Under these circumstances, the so-called ‘measurement problem’ in quantum theory is no
more than a smoke-screen disguising what is really happening in the physical world of the very small
beyond the limited world of ‘measurements’.
The physical action of ‘measurement’ is sufficient to establish a physical reality (in mind and/or
consciousness), as in the case of quantum measurement, but it is not completely necessary to
establish ‘reality’ outside of the mind and/or consciousness as implied by those interpretations of
quantum theory that necessitate consciousness to collapse the wave function and create reality.
However, some form of physical interaction between different material ‘objects’ is both necessary
and sufficient to establish physical reality at some level (as well as the relative space-time
framework), and that physical interaction can only be described by some form of relativity, and
perhaps entanglement in the case of the quantum. Thus it would seem that entanglement is a form
of relativity, or at least plays the role of simple relativity in the quantum model of reality.
Yet the professed goal in quantum mechanics is to calculate the probability of material existence
at a ‘point’ location in space rather than the ‘extended’ area or volume of space that is normally
necessary to establish a relative space-time framework. The continuous space of relativity is derived
from the ‘extended’ particles, but quantum theory posits that particles themselves are ‘points’. A
‘point’ in space is dimensionless; hence quantum theory should be completely neutral to the
question of how many dimensions exist in our common space. Paradoxically, space consists of an
infinite number of dimensionless points and is continuous from the abstract point of view of a
mathematical structure, while matter in the form of material particles is considered point-like in the
quantum mechanical view even though particles are extended bodies in reality. This argument would
seem to imply that quantum theory is really no more than a mathematical model representing reality
(a mapping of sorts) rather than a true picture of reality.
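The contrast between a probability at a ‘point’ and a probability over an ‘extended’ region can be made concrete with a standard one-dimensional example. The Gaussian density below is an illustrative assumption, not a calculation from the text: a probability *density* is finite at a point, but the probability assigned to the point itself is zero; only an extended interval carries finite probability.

```python
import math

sigma = 1.0  # illustrative width of the distribution

def density(x):
    """Probability density |psi(x)|^2 for a normalized Gaussian."""
    return math.exp(-x*x / (2*sigma**2)) / (sigma * math.sqrt(2*math.pi))

def prob(a, b):
    """Probability of finding the particle in the interval [a, b]."""
    z = lambda x: x / (sigma * math.sqrt(2))
    return 0.5 * (math.erf(z(b)) - math.erf(z(a)))

print(density(0.0))        # finite density at the point x = 0 (~0.399)
print(prob(-1e-9, 1e-9))   # ~0: a bare point carries no probability
print(prob(-1.0, 1.0))     # ~0.683: an extended region does
```

In this standard picture, every physically meaningful probability is an integral of the density over an extended region, which is consistent with the text's claim that measurement always concerns extension rather than dimensionless points.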
Some quantum theorists may disagree with portraying particles as extended material bodies, but
then others would completely agree: a silent debate over this issue has raged throughout the
scientific community for decades without any obvious resolution. This silent battle over the reality
of point particles is a true ‘uncertainty’ in its ugliest form. But in reality, there are ‘no’ point particles
– real material particles are extended – so the probability inherent in nature according to the
quantum theory is no more than the difference between the ‘point’ and the ‘extended material
reality’ of a particle and that uncertainty extends in space (as Δx) as well as time (as Δt).
This point-of-view closely parallels the difference between the mathematics and the physical
reality of the scalar and vector potentials within the electromagnetic and gravitational fields. Perhaps
this even means that there is a time field representing a time potential that has yet to be discovered
or that time itself is a potential in a different direction than the normal three directions of space. On
the other hand, the link between space and time must be geometrical in nature or at least the
geometrical structure of space-time is implied. Non-symmetry between space and time is clearly
indicated in the Minkowski signature for space-time. In the formula for proper time, the signature
for space-time is quite evident (+++-).
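With the (+++-) signature just mentioned, the proper-time formula can be written out explicitly (sign conventions vary between authors; this is the form implied here):

```latex
% Minkowski line element with signature (+++-):
ds^2 = dx^2 + dy^2 + dz^2 - c^2\,dt^2
% Proper time along a timelike worldline:
c^2\,d\tau^2 = c^2\,dt^2 - dx^2 - dy^2 - dz^2
```

The single sign difference between the three spatial terms and the temporal term is the non-symmetry between space and time that the signature makes evident.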
This signature indicates that the fourth dimension of time has a hyperbolic or Lobachevskian
geometrical structure as opposed to the Riemannian geometrical structure of the normal three
dimensions of space. This could possibly mean that space-time is ‘cylindrical’ with the three
dimensions of space closed (Riemannian elliptical), which is an ideal condition for stability, while the
dimension of time is open (Lobachevskian pseudo-spherical), meaning that time never closes (or
pinches off to loop back to the beginning point in three-dimensional space) in the future.
Time should continue ‘as is’ forever just as two parallel lines would never touch even at infinity.
However, this solution would only come to pass if time were Euclidean. Since time is
Lobachevskian, parallel lines spread apart into the distant future as they progress away from the
present point in time (where they meet, or come within a quantum of time of one another), so time-lines of events would look more like a ‘blunderbuss’, diverging into the future. Such divergence is
unstable and could account for instability and thus spontaneous decay of some particles (temporary
or pseudo-particles if not just space-time resonances) as well as the relative uncertainty in time that
is demonstrated by the Heisenberg uncertainty principle.
This interpretation further means that the uncertainty is inherent in the quantum ‘concept’ of
measurement itself, rather than either the instruments or the act of measurement. The uncertainty
arises from the coincidence between or the percent of an extended body that overlaps (coincides
with) the point-by-point probability distribution as calculated in quantum theory. This ‘coincidence’
gives rise to an uncertainty in the probability measurements, not in the material reality of the particle
and resolves itself in the mathematical model of measurement through ad hoc renormalization.
Alternately, the uncertainty emerges from the fact that any portion of the length, area or volume of
an ‘extended’ real particle can overlap (or coincide with) the ‘point’ in space for which the theoretical
probability is determinable by quantum considerations as a probability density. Hence, the
uncertainty principle only refers to dimensionless ‘points’ in space or time because it cannot
theoretically specify the reality of the position ‘of an extended body’ in space or time with any
precision.
Reinterpreting the emergence of the Second Scientific Revolution
This reinterpretation of the quantum measurement problem is no different than problems in
theoretically or abstractly specifying extension in space that date back to the ancient Greek
philosophers (Parmenides and Zeno) and the scholastic philosophers (who had to define how to
divide space as an abstract quantity to develop science) of the middle ages of European history.
They were unable to develop an abstract model of space because they associated space with God,
but God, and hence space, was indivisible. The quantum, whatever it is, is also indivisible, as is a
discrete point in space. The uncertainty associated with the quantum is merely the difference
between an ‘extensionless’ and ‘dimensionless’ point in space and the real extension of any material
‘object’ in the commonly sensed space of three dimensions.
Ironically, this same problem was considered and solved in stages that began during the late
Renaissance and continued through the Scientific Revolution with the work of Francesco Patrizi,
Pierre Gassendi, Isaac Newton and others. The problem that they faced was difficult, but not
insurmountable: how to ‘reduce’ space as an abstract idea or ‘quantity’ so lengths could be used in
such physical concepts as speed and acceleration. Everyone thought of space as an organic ‘whole’
(whether associated with God or not) that could not be split into parts, either abstractly or
otherwise: Nor did there seem to be a need to split space into parts other than philosophically. The
solution of their problem came when Patrizi proposed the existence of two different forms or types
of space that overlapped: relative space and absolute space.
This implied natural dualism of space and/or time is analogous to the difference between the
‘point’ particles of quantum theory and the extended material particles that characterize reality.
Absolute space could not be split up, broken down or reduced in any way, means, shape or form.
However, relative space was determined by the positions of material bodies in relation to each other
so measures such as length could be made by comparison to other objects, rather than relative to
any absolute non-reducible concept such as (absolute) space. All of the different possibilities of all
relative spaces were contained within the one single absolute space or fixed three-dimensional
Euclidean framework, which could not be reduced in any manner, not even in thought alone, as real
measurements in relative space could be. So it would seem that the ‘measurement problem’ dates
back much further than the development of modern quantum theory although modern quantum
theory has just placed the same old problem within a new context.
In the Newtonian worldview, absolute space, and only absolute space, was a ‘thing’ (as defined
by its physical properties) and the positions of material objects by which relative distances were
judged all existed inside (or were contained within) that absolute space. In other words, relative
space was a relationship of sorts for judging positions rather than a ‘thing-in-itself’, so it could not,
and need not, be divided into smaller parts in any manner, although this was not explicitly stated by
Newton and had to be corrected and explained in greater detail by Einstein. Yet lengths in relative
space could still be measured according to their relationships to material ‘things’, even though the
abstract idea of relative measurements was incomplete until much later.
Meanwhile, absolute space was identified with the ‘infinite’ and eventually God, so it was
completely ‘indivisible’, without exception. This dual nature in the abstract concept of space is still
reflected in the difference between the mathematical points (infinitesimals) that constitute the
continuity of space and the actual extensions of material objects by which we judge the extension of
relative space itself as well as the quantum concept of ‘point’ particles (literally material
infinitesimals) that fill the vacuum, which is itself infinite in the sense of undefined rather than
infinitely extended.
However, for the purposes of relativity and the quantum theories, the important point is that
scientists and philosophers have been debating the problem of the ultimate reducibility (or
irreducibility) of space since before the abstract concept of space even entered science. Einstein
merely brought the discussion to a new and more sophisticated level by showing that the absolute
frame of reference was completely unnecessary because all of physics could be conducted
completely within a relative space-time framework. On the other hand, Planck discovered the lower
limit that could be used to establish the relative framework of space-time and thus ‘measure’ physical
reality accurately. It is not so much that Planck’s constant represents a lower limit to measurement,
but that it is also a lower limit to the physical reality of those types of things that can be measured as
long as the fundamental material particles are the ‘smallest’ rulers by which to measure extension in
space and/or time. Planck’s constant is thus a physical constant of space and time that must be
adhered to in the natural formation or creation of fundamental particles and waves: Planck’s
constant is an inherent property or physical characteristic of space-time itself by which real extended
particles are defined.
The space of common perception is three-dimensional and no one would argue with that
observation. The three-dimensionality of space is determined by the relative position of observed
material objects: No one would argue with that observation either. However, there is no guarantee in
either case that there are no other dimensions to space-time than the normal three as long as matter
is limited to these three. In reality, only matter in the form of ‘material particles’ that are limited in
size by Planck’s constant is three-dimensional.
Nothing in either relativity or quantum theory limits the dimensionality of space and/or time
beyond the constraints of the material particles by which we perceive space and time. In fact,
quantum theory only places a limit on the ability to measure a physical quantity that is defined at an
‘extensionless’ point while it is changing position over time. There are three such limits: one for
three-dimensional space (permittivity), one for the four-dimensional space in which the normal
three-dimensional space is embedded (permeability), and the last for coupling space and time
together (Planck’s constant). Given that there seems to be a ‘hidden’ reality that exists beyond the
quantum limit and that even the dimensionality of space is questionable, the implication
that reality is ‘discrete’ has no logical foundations. The discrete nature of reality is merely a
philosophical choice that humans impose on nature.
In the case of the quantum, infinitesimals and continuity are thrown out and mistakenly
replaced by a concept of particulate discrete point reality. However, quantum theory does not treat
Planck’s constant ‘h’ as a limit because a limit to the ability to conduct physical measurements would
imply that ‘something’ (fundamental to reality) that is continuous passes through and beyond that
limit. Quantum theory instead sees ‘h’ as a limit to reality itself or a final point to distinguishing the
particle or discrete nature of reality. Quantum theorists have clearly imported their belief in the
existence of ‘particles’ into the argument from outside (an a priori bias) rather than allowing the
physics and nature to determine whether reality is continuous or discrete at its most fundamental
level.
Quantum theory merely assumes that only (discrete) point particles exist within the quantum
vacuum, quantum field or whatever it is called at any given time, while the quantum vacuum cannot
be further subdivided in any manner, introducing a new duality into nature to replace the Newtonian
duality of relative and absolute space. The physical difference between point particles and extended
particles is thus subverted or circumvented rather than being solved and scientists have been
misdirected in their search for a complete unified theory of physical reality. It’s not so much that the
quantum forms a new dualistic structure with respect to relativity theory or that the quantum theory
is just filling or evolving within some dualistic need in physics that corresponds to the older
Newtonian concept of absolute space, but rather that the quantum theory is, has become and fully
utilizes a Newtonian style of absolute space-time as a modern form of Newtonian absolute space
and time. When seen in this manner, the notion that Newtonian science was in full bloom one day,
at the very height of its successes, and that modern physics completely overthrew and
replaced it the next day in 1900, forming a historical paradox and discontinuity, as everyone has
believed, is complete rubbish. The successes of Newtonian theory brought on and are continuous
with modern physics, while Newtonian physics has never been replaced or overthrown in the
scientific revolution of 1900-1917.
Quantum theory has gotten by and made its reputation as being completely anti-Newtonian
(anti-relative and anti-classical) and quantum mechanics has billed itself as the modern replacement
for Newtonian mechanics. Yet the truth of the matter is far more complicated than that. While
special and general relativity are the ultra-relativistic space-time extensions of the Newtonian
concepts of relative space and time, quantum physics has become the modern space-time extension
of Newtonian absolute space and time. So it is no coincidence that the physical characteristics of the
quantum and Newtonian absolute seem to be similar. Modern relative space-time requires a field
(either single or unified) to make it whole and substantial, while modern quantum theory requires a
quantum vacuum or quantum field, modern replacements for the older Newtonian concept of an
aether, to make it whole and substantial. Photons and virtual particles are no more than individual
points in space along the leading wave of a classical Huygens (principle) wave structure where new
waves are generated. Within this context, many proponents of quantum theory and the
Copenhagen Interpretation have prided themselves on their revolutionary advances and break
with Newtonian theory, but in fact they have only fooled and convinced themselves with their own
historical propaganda that they have perpetrated something wholly and completely new as well as
unique in physics.
The old argument that relativity and the quantum are mutually incompatible and thus physically
exclusive of each other because the quantum represents the discrete in nature and reality represents
continuity in nature is merely a philosophical proxy for the more fundamental geometrical
differences between point (discrete) and extension (continuity). Using this philosophical proxy has
allowed quantum philosophers and theoreticians to characterize the quantum theory as a non-geometrical theory based on probabilities, as opposed to the geometrical (metric) nature of relativity,
because discrete particles are countable. Yet quantum philosophers also claim that the discrepancy
between the quantum and relativity is due to the measurement problem and that the measurement
problem is a fundamental aspect of the quantum theory, even though measurement is technically a
characteristic of geometry as opposed to the counting of numbers in probability theory and the
classical field of mathematics known as arithmetic. And of course the statistics used in quantum
theory also deals with counting numbers and arithmetic rather than the actual classical notion of
geometry which depends on measurement. The difference between arithmetical counting and
geometrical measurement renders it necessary to express the location of quantum particles (points)
in space as probability densities instead of extended material bodies even though they have always
been measured as extended bodies that occupy specific extended locations in space and space-time.
With this in mind, the whole philosophical concepts of determinism and indeterminism fall
apart. Nature and the natural world are neither deterministic according to classical physics nor
indeterministic according to the quantum. These arguments were just invented by quantum scientists
and philosophers who saw themselves as the winners of the Second Scientific Revolution to bolster
their claim to uniquely represent the most fundamental physical aspects of reality. Newton’s own
concept of determinism was expressed in the final Queries of his Opticks. To him, God and only
God was deterministic and God was associated with absolute space and time. The later Newtonian
concept of determinism was expressed by Pierre-Simon de Laplace while he explained Newtonian physics
to Napoleon Bonaparte. Yet in this case, Laplace was merely bragging to Napoleon and trying to
elevate his own political position, but his expressed opinion of the Newtonian concept of
determinism was a fitting expression of Newtonianism so it was adopted by the winners of the
Second Scientific Revolution as an example of the older scientific paradigm that they claimed to have
overthrown. Yet even Newtonian determinism is false because chaos theory in mathematics and
physics allows the emergence of complexities in nature whose later physical characteristics cannot be
determined from the initial physical conditions from which they emerged. Given that and the fact
that nature is not and cannot be indeterministic, the correct inference is that nature can be neither
deterministic nor indeterministic and any questions as to this characteristic of nature are irrelevant.
The probabilities expressed by quantum mechanics are not found in nature and nature is not
indeterministic. Instead, the mathematized quantum theory actually imposes probability and
indeterminism on nature.
By wrongly adopting this new duality of relativity and the quantum as mutually incompatible,
quantum scientists and philosophers have been mistakenly led to conclude that reality does not exist
until consciousness collapses the wave function to create material particles and with their creation a
relative space or space-time. The same could be said for a Newtonian absolute space which was
associated with human mind and consciousness, holding up the advance of science for more than a
century. Quantum theorists can only get away with this fundamental philosophical error because the
vacuum, whatever it is, seems to have no properties while a purely relative space, one that only exists
relative to the material particles by which it is determined, would also have no properties. Although
a relative space would ideally have no properties, the fact that this conforms in part to the quantum
discrete view of reality mutes any questions of whether reality extends (continuously or otherwise)
beyond the quantum limit.
However, there is a third physical entity or ‘thing’ that still needs to be considered: The field.
While neither the vacuum nor relative space has any internal properties of their own, fields do. In
fact, fields have properties that characterize how they interact with both the vacuum and relative
space as well as internal properties. The scalar and vector potentials which are expressed by the
Hamiltonian and Lagrangian functions used in the Schrödinger equation constitute the various real
‘fields’ that fill each and every ‘point’ in space and space-time.
So the interaction between the existing physical fields and the quantum, regardless of any questions
regarding the fundamental discrete or continuous nature of ‘reality’, seems to indicate that relative
space (the vacuum of empty space) does have properties: The speed of light is one case in point. The
fact that so-called empty relative space can limit the speed of light means that even relative space has
field properties; or more accurately the field displays special properties due to its interaction with
relative space as well as the ‘vacuum’. Those properties are permittivity and permeability, both of
which are needed to determine the speed of light through any medium. Therefore space can be
considered a ‘thing’ in its interaction with fields even though it is still relative within itself.
So what seems an inherent probability in nature when trying to localize a measurement to a
discrete point in space and/or time is actually reality ‘extended’ in the space and time directions in a
mutually interacting space-time, which is for all practical purposes continuous (although not
fundamentally) rather than discrete. Since particles are real and extended in space, any part of their
extension which overlaps the ‘point’ or rather any ‘points’ in space where the particle is
probabilistically ‘predicted’ to exist by quantum mechanics (the probabilistic potential position) is no
more than the overlap or a coincidence between the quantum mapping and external reality. The
probability densities of quantum mechanics are just mathematical point mappings that simulate
reality, not reality itself. Nor do they somehow replace reality or show that reality does not exist
without the intervention of either consciousness or entanglement. In other words, there is a physical
reality beyond the quantum limit or as Einstein would have said (and did) some ‘elements of reality’
are missing from the quantum picture so quantum theory is, in fact, incomplete.
Conclusion
On the face of it, relativity and quantum theory seem to be mutually incompatible because the
first represents a continuous reality and the second a discrete reality. However obvious this
statement may seem, it is untrue. The discrete and continuous worldviews are indeed incompatible,
but relativity and the quantum theory are not, and therein lies the problem and the misinterpretation. As both a mathematical formalism and a ‘picture’ of nature, the quantum theory is not necessarily a
theory of the discrete aspects of physical reality. Therefore, there is no compelling need or pressing
reason for special relativity and quantum theory to be incompatible in either nature or reality. The
particulate worldview may well be sufficient as a quantum theory, but it is not necessary to fill all
quantum requirements. A continuous theory or model of the quantum is every bit as valid as a
particulate model. It is important to know this because it is necessary to understand the true nature
of reality before the emergence of the Second Scientific Revolution can be accurately interpreted and
understood.
Special relativity and the quantum theory are completely compatible under the proper (physical)
conditions, which means that the equations of one can be mathematically derived from the other. Doing so introduces two new types of uncertainty, here designated ‘relativity uncertainty’ and
‘absolute uncertainty’. These uncertainties are expressed in the relationships Δx/Δt ≤ c and Δx/Δt ≥
c, respectively. In both cases, considering c as a speed may be sufficient to establish the
uncertainties, but it is not necessary. In other words, the constant c is just a number in these cases
and need not as yet have a physical interpretation. While this may seem strange to anyone
indoctrinated in normal quantum theory, it is natural and perfectly understandable when it is pointed
out that ‘c’ is not a fundamental quantity: it can be broken down into more fundamental physical constants, permittivity and permeability.
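The decomposition of ‘c’ into permittivity and permeability is the familiar electromagnetic relation c = 1/√(μ₀ε₀). A minimal numerical sketch, using the standard SI values purely for illustration:

```python
import math

# Approximate SI values of the vacuum constants (illustrative only).
epsilon_0 = 8.8541878128e-12  # permittivity of free space, F/m
mu_0 = 4 * math.pi * 1e-7     # permeability of free space, H/m

# 'c' recovered from the two more fundamental constants.
c = 1 / math.sqrt(mu_0 * epsilon_0)

print(round(c))  # ≈ 299792458 m/s
```

Even treated ‘as just a number’, c drops out of the two constants; whether to read them further as connectivity constants, as proposed here, is a separate interpretive claim.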
If permittivity and permeability are defined as connectivity constants for three- and four-dimensional space, respectively, then ‘c’ is a coupling or connectivity constant for the transition
from three to four dimensions. The condition Δx/Δt ≤ c and its companion relationship ΔE/Δp ≤
c together form what can be called the ‘Einstein certainty principle’ for defining or determining
space-time. Planck’s constant h can be regarded as a space-time coupling or connectivity constant
because suppressing it allows space and time to be separated into the Heisenberg formulas ΔxΔp ≥
h and ΔEΔt ≥ h. These findings justify Einstein’s opinion that quantum theory is incomplete as
expressed in EPR, but new questions are raised in the process.
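The four relations in this paragraph can be collected in display form; this is only a restatement of the text’s own formulas in LaTeX notation, not new physics:

```latex
% The 'Einstein certainty principle' pair proposed in the text,
% for defining or determining space-time:
\frac{\Delta x}{\Delta t} \leq c , \qquad \frac{\Delta E}{\Delta p} \leq c
% Suppressing the space-time coupling constant h allows space and time
% to be separated into the Heisenberg pair:
\Delta x \, \Delta p \geq h , \qquad \Delta E \, \Delta t \geq h
```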
If special relativity and quantum theory are compatible under any conditions, as demonstrated
in this argument, then where does the inherent incompatibility in the differences between the
discrete (quantum) and continuous (relative) views of reality come from? What is its origin? The
answer is simple, relatively speaking. At the fundamental level of reality where they are shown to be
compatible, neither discreteness nor continuity has yet become an issue or necessity of nature. Quite
simply, the Heisenberg uncertainty principle neither implies nor requires that matter must be
particulate at a far more fundamental level than where elementary particles are normally observed, as
surprising as that might seem.
The real question is not whether reality is continuous or discrete at the most fundamental level
of reality, nor is it about particles themselves because the reality of elementary particles has been
confirmed. It is about whether the act of measurement (whether detecting or observing reality) or
the value measured is certain or uncertain and that question implies questions about extension in
either space or time if not both. It is irrelevant whether the extension in space and time is taken
between two points or the outer boundaries of two extended particles, as long as the distance
between the point center of each particle and that particle’s outer surface is taken into account.
The Heisenberg uncertainty principle has been grossly misinterpreted and misrepresented in the
past: It should now be interpreted as representing the uncertainty in measuring the ‘instantaneous’
position (theoretically as close a measurement of a stationary position as can be made while an
object is non-stationary) of a material object in space that is moving or the ‘instantaneous’ position
in time of a material body within the natural flow of time (and is thus not changing its energy state).
In either case, nothing requires that the material body be either particulate or not, just measurable.
Quantum theory, as it is now presented in physics and science, is based on an a priori
assumption of the particle (discrete) nature of reality, but nothing in either physics or nature itself
requires it. Technically, the same is true for continuity, but continuity has a stronger case for
representing the fundamental nature of physical reality because continuity is built into the
mathematics used to model physical reality as well as the definitions of other physical concepts or
measurable quantities such as acceleration. Otherwise, scientists and philosophers who support the particle theory in quantum mechanics must have an unwarranted a priori predilection toward that point of view, because the theory on which the particulate model is based implies neither the existence
nor the necessity of particles. Scientists are forcing particles on nature to explain ‘all levels’ of
physical reality even though scientists can only detect real physical particles at the atomic and closely
related sub-atomic level.
The Heisenberg uncertainty principle implies nothing more than a limit to measurement and
particle creation (during high-energy experiments in space-time) and these do not rule out the
possibility of a continuous field as opposed to particles in a vacuum to explain a more fundamental
level of reality. Quantum theory may support the reality of fundamental particles, but it does not
require the discrete nature of reality at ever more fundamental levels. Nor can we say that continuity
is the ultimate reality at this time because we cannot ‘test’ or verify that particular hypothesis by
measurement either. Quantum theory can exist and remain relevant and valid without specific
reference to an ultimate discrete nature of reality. The established experimental fact that particles
exist at the atomic level does not guarantee that continuity cannot exist below the level of reality
where science is limited to measure only particles.
However, if we assume continuity stretches down to and beyond (below) the quantum limit of
measurement, where we can still verify the existence of various fundamental particles and thus
material reality, the complete unification of relativity and the quantum into a single comprehensive
theory proceeds both easily and naturally. In other words, relativity and the quantum theory are
correct in their own domains of application as far as they go, but the particle hypothesis limits them
both and misleads scientists in the continuing progress and development of science. Under these
circumstances, all arguments and debates over the deterministic and indeterministic interpretations
of nature are completely irrelevant to physics and represent no more than diversions based upon
personal biases. This conclusion necessarily raises other questions, such as ‘where and how does
probability enter nature?’ and, if the probabilistic interpretation of quantum theory is incorrect, ‘why has it been so successful?’
The probability problem is not even related to the concepts of physical discreteness and
continuity, but reflects how the models based upon those concepts affect the act of ‘measurement’
itself. This problem actually dates back more than two millennia. Quantum theorists represent
physical particles as ‘points’ in space and/or time, but material particles are physically extended
bodies in both space and time, not points. The probabilities arise from the differences between a
‘point’ and an extended particle. ‘Points’, also called ‘infinitesimals’, cannot be measured - in any
given extension in space or time there are an infinite number of points. Neither space nor time can
be subdivided as was demonstrated and debated during the middle ages by the philosophers and
scholastics of that bygone era. The only difference is that science no longer complicates these issues
with notions of God. However, we can still make measurements in both space and time by
comparing the relative sizes of things. This notion defines the essential problem that decimated the
Pythagorean cult in ancient Greece and that Zeno made famous with his various paradoxes.
The quantum theory only puts a limit on our taking or making measurements by comparing the
relative sizes of things, which introduces the probability into nature by way of the abstract notion of
measurement. The problem is not in the act of measurement but in the difference between the
scientific concept of what is being measured and the thing measured. The probability represents the
percent difference in the alignment or coincidence between the object measured and the standard by
which it is measured (compared) and the standard is conceptual. If some ‘thing’ is not measured or
compared to some standard, then there is no probability involved. Thus the probability does not
exist in either nature itself or the instruments being used for measurement by the comparison
method. The probability is conceptual because the problem is perceptual and arises in the act of
measurement by comparison of the size of real extended objects.
A simple conceptual truth should demonstrate this idea much better: It is impossible to
accurately measure a stationary object with a moving ‘ruler’. Conversely, it is impossible to accurately
measure a moving object with a stationary ‘ruler’. In either case, the degree of accuracy changes as
the physical conditions by which the measurements are being made change. This simple truth
introduces the relativity of motion into the uncertainty worldview. In other words, it is impossible to
measure the size and/or position of a material object by comparison (coincidence) to a physical
standard if the object and standard are moving relative to one another, even if either the object or
the standard is stationary.
The object and standard create a relative space between them (through passive interaction) that
introduces a minimum error of ‘h’ into the measurements, even when the relative motion is zero. Measurement is by its very nature an active interaction and thus disrupts the passive nature of
the relative space-time. The notion that we ‘can’ measure something without disturbing it is at fault.
Any measurement is by its very nature disruptive by some small amount. This error increases as the
relative motion (speed or momentum) between objects measured or object and measurement are
increased, which in turn increases the ‘improbability’ and thus the ‘uncertainty’ of making a perfectly
accurate measurement.
The ancient Greeks could neither solve nor state this problem precisely, because they had no
concept of speed. They had no concept of speed because they could not differentiate between
different speeds, or rather they could not develop a concept of changing speed (among other
things). This problem was not solved until the Middle Ages, when concepts of uniform and difform
(accelerated) motion were developed. The central problem, even after crude abstract concepts of
space and measurement were developed, was understanding the concept of an ‘instantaneous’
change of any kind while maintaining a concept of physical reality. Quite simply, an ‘instant’ was
essentially zero time and it was impossible to divide by zero.
So how could anyone divide a distance by ‘zero’ time to derive a true ‘instantaneous’ speed?
When applied to the concept of space and measurement, this problem reduced to the notion ‘how
close is close’ and that is exactly how the scholastics thought of and solved the problem – by using a
graphical representation of changing speed and looking at changes in speed over ever smaller
durations of time. The problem was not solved algebraically until several centuries later by the
development of the concept of limits. Yet today this problem still survives and befuddles the
greatest minds in science and philosophy under the guise of the debate between the discrete and
continuous natures of physical reality and thus affects our definition of the quantum itself. The
quantum, as represented by the Planck constant h, is just the lower limit of physical measurements
in space and time by which we measure or judge (experience) physical reality because we cannot go
to or physically approach the limit of zero time at the same time as we measure a non-zero distance.
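The limit concept that eventually resolved the ‘instantaneous speed’ problem can be sketched numerically; the position function below is a hypothetical example, not anything drawn from the text:

```python
# Average speed over ever smaller, but never zero, durations of time
# approaches a definite 'instantaneous' speed -- the concept of a limit.
def position(t):
    return t * t  # hypothetical trajectory x(t) = t^2

t0 = 1.0
for dt in (1.0, 0.1, 0.01, 0.001):
    avg_speed = (position(t0 + dt) - position(t0)) / dt  # never divides by zero
    print(dt, avg_speed)
# The quotient tends toward 2.0 as dt shrinks; no division by a
# literal zero time is ever required.
```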
The latest developments, theories and models within the theoretical quantum pantheon are no
different. The most advanced model of the quantum, the standard model, suffers from the same
problem. All interactions between ‘particles’ are mediated by ‘exchange particles’. The exchange
particles supposedly allow real physical interacting ‘particles’ to change speeds without ever having
accelerated. They just make quantum jumps from one constant speed to the next. This may work as
an approximation of an acceleration, especially when the distances between interacting real particles
are extremely small, but it is only an approximation to reality in the end. QED and QFT are just
extremely accurate physical approximation techniques and do not directly portray reality and its
measurement.
Reality is a product of constant change, not just bumped-up constants devoid of constant or
continuous change. Acceleration is real, but it cannot be explained by quantum jumps between
constant speeds no matter how short the time interval between jumps. This fact is especially true in
the case of gravity, because gravity is such an extremely weak force at any level of physical reality
and acts over such extremely large (astronomical) distances that any accuracy advantage gained by
the quantum substitute (approximation technique of QFT) for acceleration due to the short
distances and times involved in virtual (imaginary) particle exchanges is lost. In other words, there is
no such thing as gravitons, the suggested ‘exchange’ particles for gravitational action, because gravity
is far too weak or ‘small’ an effect to be particulate or discrete: It falls far below the ‘radar’
established by the quantum limit and thus physical constraints of space and time.
The particle theory of matter, not the quantum theory but the particle theory alone, has now
reached a level of pathological redundancy whereby the properties of particles (rather than just
changes in physical quantities associated with particles) require other particles for their explanation.
These new particles will eventually require other particles and then still other particles, ad infinitum.
It will eventually take an infinite number of different and unique particles to justify using a particle
model to replace the continuous model of reality. There are no Higgs particles and there is no Higgs
(aether) field: The ‘point’ particles of the quantum theory are no more than the mathematical ‘singularities’ that general relativity or any continuous theory predicts at the center of mass (the density of a particle as the radius of the particle goes to zero) of all elementary particles; the Higgs boson is no more than the real spatial curvature (or extension) that extends between the center of a particle (the singularity) and its outer material boundaries; and the Higgs field is no more than the real spatial curvature outside the boundaries of any and all real extended particles. So we can picture gravity as providing the basic geometrical structure of the universe in terms of an extrinsic space-time curvature in a real higher dimension of space, as specified by relativity.
The bottom line is this: Both the mathematics and experimental results of quantum theory are
fully compatible with a continuity model of reality, while the discrete model is not compatible with
relativity and runs into insurmountable problems when applied to physical reality. So the discrete (or
particle) model of reality only exists because people choose to interpret reality in that manner, for
whatever reason, but not by nature’s calling or instructions. If our interpretation of nature is not
dualistic in this manner, then space and time are themselves physically dualistic and must be treated as such.
Einstein fought the right war by challenging the quantum theory, but chose the wrong battle in
EPR. Yet the modern interpretation of quantum mechanics, which is wrong, leads to an
indeterministic and probabilistic nature of the world, which must also be wrong. The error lies in the fact that science believes it can make absolutely perfect measures of things and that these measures are equivalent to complete knowledge about the things measured. However,
science can have knowledge of ‘things’ (a theory) without absolutely perfect measures of the thing.
Science does not really have the ability to perfectly measure any ‘thing’, but introduces an
improbability into any measures it tries because it is unwilling to accept this simple truth: The only
way to have a perfect measure (knowledge) of a thing is first and foremost to be that thing, which is
not possible. Quantum theory is therefore not about determinism and indeterminism, nor is it about
probabilities that are built into nature itself. The quantum merely marks the physical boundary
between static (absolute) and dynamic (relative) interpretations of nature.
So the Second Scientific Revolution was not really about the failure to detect the aether or
explain blackbody radiation, those two experimental problems that supposedly caused (but really just triggered)
the revolution. The revolution was really about how scientists came to represent and reinterpret the
concepts of space and time as Newtonian physics progressed beyond its own fundamental principles
as well as rethink their relationship with each other according to their newly determined
fundamental aspects. The new views of both space and time that emerged as space-time were heavily
influenced by two discoveries made earlier in the nineteenth century: The concept of a physical
‘field’ (the continuum) replaced the Newtonian necessity for an aether to render space substantial
while the new non-Euclidean geometries (especially Riemannian geometry) came to replace
Euclidean geometry as the standard for measurements in the field.
.................................
BIBLIOGRAPHY
Y. Aharonov and David Bohm, “Significance of Electromagnetic Potentials in the Quantum Theory”, Physical Review, 115 (1959): 485-491.
John S. Bell, “On the Einstein-Podolsky-Rosen paradox,” Physics, 1 (1964): 195-200; reprinted in
Bell, Speakable and Unspeakable in Quantum Mechanics (Cambridge: Cambridge University Press,
1987): 14-21.
John S. Bell, “On the problem of hidden variables in quantum mechanics”, Reviews of Modern Physics,
38 (1966): 447-452; reprinted in Bell, Speakable and Unspeakable: 1-13.
David Bohm, “A Suggested Interpretation of the Quantum Theory in Terms of “Hidden Variables”
I”, Physical Review 85 (1952): 166–179.
David Bohm, “A Suggested Interpretation of the Quantum Theory in Terms of “Hidden Variables”,
II”, Physical Review 85 (1952): 180–193.
Niels Bohr, “Can the Quantum-Mechanical Description of Physical Reality Be Considered
Complete?” Physical Review, 48 (1935): 696-702.
Robert Deltete and Reed Guy, “Einstein and EPR”, Philosophy of Science, 58 (1991): 377-397.
Albert Einstein, Boris Podolsky and Nathan Rosen, “Can the Quantum-Mechanical Description of
Physical Reality Be Considered Complete?” Physical Review, 47 (1935): 777-780.
Graham Farmelo, “Paul Dirac, a man apart.” Physics Today 62, 11 (November 2009): 46-50.
Arthur Fine, The Shaky Game: Einstein, Realism and the Quantum Theory (Chicago: University of Chicago
Press, 1986).
Edward Grant, Much ado about nothing: Theories of space and the vacuum from the Middle Ages to the Scientific Revolution. New York: Cambridge University Press, 1981.
Victor Guillemin, The Story of Quantum Mechanics. New York: Charles Scribner’s Sons, 1968.
Helge Kragh, Quantum Generations: A history of physics in the twentieth century. (Princeton: Princeton University Press, 2002)
Erwin Schrödinger, “Die gegenwärtige Situation in der Quantenmechanik”, Naturwissenschaften, 23:
807-812; 823-828; 844-849 (1935). Also as “The Present Situation in Quantum Mechanics”,
translated by John D. Trimmer, Proceedings of the American Philosophical Society, 124, 323-38; also
appeared as Section I.11 of Part I of Quantum Theory and Measurement, J.A. Wheeler and W.H.
Zurek, editors, New Jersey: Princeton University Press, 1983. Available online at
<http://www.tu-harburg.de/rzt/rzt/it/QM/cat.html#sect5>
John von Neumann, Mathematische Grundlagen der Quanten-Mechanik, Berlin: Julius Springer-Verlag,
1932; In English translation (Princeton: Princeton University Press, 1955).
Roger Penrose, The road to reality: A complete guide to the laws of the universe. New York: Knopf, 2005.
Julian Schwinger, editor, Selected Papers on Quantum Electrodynamics. New York: Dover, 1958.