File spoon-archives/technology.archive/technology_2000/technology.0006, message 73


Date: Tue, 27 Jun 2000 22:41:51 +0200 (MET DST)
From: Arun-Kumar Tripathi <tripathi-AT-statistik.uni-dortmund.de>
Subject: The relationship of complexity and predictability is also similar


Dear technologists,

Here is the one paper for you to raise some points and give any new
scientific and philosophic ideas regarding "Science as a Language, the
Non-Probativity Theorem and the Complementarity of Complexity and
Predictability" -This paper in its current pre-publication format has been
prepared for discussion purposes with the hope of creating some dialogue.
The author respectfully requests the readers' comments and criticisms, which
can be emailed to Prof. Robert Logan at <logan-AT-physics.utoronto.ca> or to
Arun Tripathi at <tripathi-AT-statistik.uni-dortmund.de> Thanks.--Arun


Science as a Language, the Non-Probativity Theorem and the Complementarity
of Complexity and Predictability
by Robert K. Logan, Dept. of Physics, University of Toronto
<logan-AT-physics.utoronto.ca>

Abstract: A linguistic analysis and a formal mathematical proof are
presented to show that science can not prove the truth of a proposition
but can only formulate hypotheses that continually require empirical
verification for every new domain of observation. A number of historical
examples of how science has had to modify theories and/or approaches that
were thought to be absolutely unshakable are presented including the shift
in which linear dynamics is now the anomaly and non-linear dynamics the
norm. Complexity and predictability (as in the opposite of chaos) are
shown to have a complementarity like that of position and momentum in the
Heisenberg uncertainty principle. The relationship of complexity and
predictability is also similar to that of completeness and logical
consistency within the context of Goedel's Theorem.

Science as a Language

In a study dating back to 1995 language was shown to be not merely a
medium of communication but an information processing tool or system that
changes the way we think (Logan, 1995). Building on this idea it was shown
that speech, writing, mathematics, science and computing represent an
evolutionary chain of languages. This hypothesis was justified by first
showing that each of these languages has its own unique semantics and
syntax. Next it was demonstrated that each new language arose as a
bifurcation from the set of older languages each time a new level of
information processing was required. Writing and math, the second and
third languages, arose from the need to keep records of agricultural
commodities given as tributes to Sumerian priests for redistribution to
irrigation workers. (Schmandt-Besserat, 1986). Writing and math gave rise
to formal education and schools. The scholarship of the teachers who
taught in these schools produced a new information overload. The need to
organize the new knowledge created by the scholars resulted in the new
language of science, the fourth language. The information overload
generated by science gave rise to the need for the development of automated
information processing and resulted in the advent of computing, the fifth
language.

Within the framework of this model of the evolution of language,
mathematics and science are distinct languages each with its own unique
informatic objectives. Mathematics strives to solve equations and to prove
the equivalence of sets of statements involving the semantical elements of
its language, abstract numbers (such as integers, irrational numbers,
imaginary numbers), geometrical objects (such as points, lines, planes,
triangles, pyramids, vectors, tensors), sets, operators, etc. A theorem or
a proof is a unique syntactical element of the language of mathematics
which we will show can not be an element of the language of science. A
theorem or a proof establishes, using logic, the equivalence of one set
of statements (axioms, whose truth is assumed to be self-evident, and at
times other theorems that have already been proven) with the proposition
whose truth is to be established by the theorem.
Science, on the other hand, establishes the veracity of a proposition
using the technique of the scientific method of observation,
generalization, hypothesis formulation, and empirical verification. The
scientific method is a unique syntactical element of the language of
science. In addition to trying to provide an accurate description of
nature science also attempts to describe nature in a systematic manner
using the minimum number of elements possible. The description of one
phenomenon in terms of another is often claimed to be an explanation. This
is one way to interpret this reduction of the number of basic elements
needed to describe nature which is a basic goal of science. Science also
endeavors to make predictions that can be tested to establish the accuracy
of its models. No matter how refined this process becomes and no matter
how many reductions and simplifications are made there always remain some
irreducible elements that resist explanation or description in terms of
simpler phenomena. The process of reduction has to end somewhere. The
basic elements in terms of which other phenomena are described can be
thought of as the basic atoms or elements of scientific description
(McArthur, 2000).

Scientists often make use of mathematical language to construct their
models of nature, especially in the physical sciences. They often employ
mathematical proofs to establish the equivalence of mathematical
statements within the context of their models. This has led to the popular
belief that science can actually prove things about nature. This is a
misconception, however. No scientific hypothesis can be proven; it can
only be tested and shown to be valid for the conditions under which it was
tested. Each proposition must be continually verified for each new domain
of observation.

The purpose of this paper is to make use of mathematical reasoning to show
and actually prove that science can never prove the truth of any of its
propositions or hypotheses. To establish our theorem, the Science
Non-Probativity Theorem, we will make use of a basic axiom of the
scientific method, namely, that for a statement or an assertion to be
considered as a scientific statement it must be tested and testable and,
hence, it must be falsifiable. If a proposition must be falsifiable or
refutable to be considered by science then one can never prove it is true
for if one did then the proposition would no longer be falsifiable, having
been proven true, and, hence, could no longer be considered within the
domain of science. We have therefore proven that science can not prove the
truth of anything. Let us restate this argument as a formal theorem
based on two axioms.

The Science Non-Probativity Theorem

Axiom: A proposition must be falsifiable to be a scientific proposition or
part of a scientific theory.

Axiom: A proposition can not be proven true and be falsifiable at the same
time. [Once proven true, a proposition can not be falsified and, hence, is
not falsifiable.]

Theorem: A proposition can not be proven to be true by use of science or
the scientific method.

Proof: If a proposition were to be proven to be true by the methods of
science it would no longer be falsifiable. If it is no longer falsifiable
because it has been proven true it can not be considered as a scientific
proposition and hence could not have been proven true by science. Q.E.D.
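The argument is pure propositional logic, so it can be machine-checked. The following toy formalization (my own sketch, in Lean 4; the predicate names Scientific, Falsifiable and Proven are choices of mine, not the author's) verifies that the theorem really does follow from the two axioms:

```lean
-- A toy formalization of the Science Non-Probativity Theorem.
-- "Scientific", "Falsifiable" and "Proven" are hypothetical predicates
-- over an arbitrary type of propositions.
theorem non_probativity {α : Type}
    (Scientific Falsifiable Proven : α → Prop)
    -- Axiom 1: a scientific proposition must be falsifiable
    (ax1 : ∀ p, Scientific p → Falsifiable p)
    -- Axiom 2: once proven true, a proposition is no longer falsifiable
    (ax2 : ∀ p, Proven p → ¬ Falsifiable p) :
    -- Theorem: no scientific proposition can be proven true
    ∀ p, Scientific p → ¬ Proven p := by
  intro p hs hp
  exact ax2 p hp (ax1 p hs)
```

Of course, formal validity here only shows the conclusion follows from the axioms; the axioms themselves (falsifiability as the criterion of science) remain the philosophical commitment.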

In the spirit of the Science Non-Probativity Theorem, we can not be
certain that this line of reasoning is absolutely valid or true. After all
we have just used the theorem, a syntactical element of the language of
mathematics, to establish a proposition about the language of science. Our
theorem is not scientifically valid but as a result of mathematical
reasoning we have created a useful probe; one that can lead to some
interesting reflections and insights into the nature and limitation of
science. If it helps scientists and the public, who tend to accept the
authority of science more or less uncritically, to adopt a more humble and
modest understanding of science, it will have served its purpose.

All that science can do is to follow its tried and true method of
observing, experimenting, generalizing, hypothesizing and then testing its
hypotheses. The most that a scientist can do is to claim that for every
experiment or test performed so far, the hypothesis that has been
formulated explains all the observations made to date. Scientific truth is
always equivocal and dependent on the outcome of future observations,
discoveries and experiments. It is never absolute.

A scientist who claims to have proven anything is being dogmatic. Every
human being, even a scientist, has a right to their beliefs and dogmas.
But it does not behoove a person who claims to be a rational scientist and
who claims that science is objective and universal to be so absolute in
their beliefs and in the value of their belief system, science. Scientists
are not immune to dogmatic and intolerant views, as Dr. George Coyne (2000)
has pointed out in his recent talk at the Humanity and the Cosmos
Symposium at Brock University, "When the Sacred Cows of Science and
Religion Meet".

I believe it is altogether fitting and appropriate that scientists
should display greater humility and tolerance in the practice of their
vocation and calling (Bertschinger, 2000) in view of the lessons to be
learned from the following historical vignettes where well established
scientific theories and dogmas had to give way to newer ones:

Newton's theory of motion gave way to Einstein's theory of relativity when
one considers velocities that approach the speed of light. The Newtonian
picture also underwent major revisions with the introduction of quantum
mechanics needed to describe atomic systems. Neither the contribution of
Newton to science nor the validity of his model of dynamics for
non-relativistic and non-quantum events were in any way diminished by
these 20th century discoveries. In fact, many elements of Newton's theory
survived in both relativity and quantum mechanics and one can not imagine
how these theories could have been formulated without the pioneering work
of Newton. Even today's current version of quantum mechanics requires the
use of the classical Newtonian Hamiltonian to formulate the energy
operator.

Einstein helped to launch quantum mechanics with his explanation of the
photo-electric effect in 1905. Despite this pioneering work he turned on
the child of his own creation, quantum mechanics, claiming that it is an
incomplete theory. Einstein's objections have given way to the acceptance
by the main stream of the physics community of probability as being an
intrinsic part of our observation of nature due to the Heisenberg
uncertainty principle. Einstein's hypothesis that quantum mechanics is an
incomplete theory can never be disputed or disproved according to the
Non-Probativity Theorem formulated above. The usefulness of his
hypothesis, however, dwindles in the absence of any concrete progress
towards a complete non-probabilistic theory of quantum mechanics and
atomic systems. And this despite the valiant efforts of David Bohm, Roger
Penrose, and others to find the hidden variables or structures that they
claim would make quantum mechanics a complete theory. One can not help but
conjecture that perhaps the reason that these variables are so well
hidden is that they do not exist. But this is only my conjecture and
belief and not anything I could prove. :-)

Einstein, Time magazine's man of the 20th century, whose name is now a
metaphor for genius, had no problem rejecting one of the elements of his
theory of general relativity. He introduced a cosmological constant into
his theory in 1917 to describe what he thought at the time was a steady
state universe. When Hubble showed in 1929 that we live in an expanding
universe Einstein immediately dropped this element of his theory. It has
since been resurrected by some contemporary cosmologists because they find
it might serve a useful purpose in their attempts to explain or describe
specific observations of the cosmos.

Another interesting shift in attitudes within the physics community is
illustrated by the recent emergence (pun intended) of chaos theory,
complexity, simplicity, plectics, emergence, and self-organizing
criticality all of which concern themselves with non-linear dynamic
systems. It was once claimed, not very long ago, that the complications
that non-linear equations presented were mere details not worthy of
attention, since the basic equations of motion, while not soluble in
closed form, are at least amenable to numerical solution if one needed to
solve them.
With the availability of computers, especially microcomputers, which
provided researchers with low-cost computing power that allowed them to
play, scientists were able to explore and examine the complexity of
non-linear dynamical systems and their sensitivity to initial conditions.
As a consequence many interesting results were arrived at and it is now
widely recognized that non-linear physics is not a special case or the
anomaly of nature but rather the norm that requires detailed attention.
The shoe is now on the other foot and it is realized that it is the
dynamical systems that can be described by linear equations that are the
anomalies.
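The kind of numerical play described above is easy to reproduce. The sketch below (a standard textbook illustration, not taken from the paper) iterates the logistic map x -> 4x(1-x), one of the simplest chaotic non-linear systems, from two starting points that differ by one part in a billion and watches the trajectories separate:

```python
# Sensitivity to initial conditions in the logistic map x -> r*x*(1-x).
# With r = 4 the map is chaotic: two nearby trajectories separate roughly
# exponentially until the difference saturates at the size of the attractor.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)   # perturbed by one part in a billion

separations = [abs(x - y) for x, y in zip(a, b)]
print(f"initial separation:              {separations[0]:.1e}")
print(f"largest separation in 50 steps:  {max(separations):.2f}")
```

A difference far below any conceivable measurement precision grows, within a few dozen iterations, to the full size of the system, which is exactly why long-range prediction fails for such systems.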

In light of the Non-Probativity Theorem it is clear that the role of
science is to probe and not to prove. It is interesting that the two
English words, prove and probe, both derive from the same Latin root,
probare, which means to test or prove. The words probability and probable
have the same root. This makes Einstein's rejection of probability in
quantum mechanics all the more ironic.

Science, the Language of Metaphor

Science involves the process of representing empirical observations in
terms of models many of which are mathematical models. These models
whether or not they are mathematical are metaphors and abstractions from
nature. The spirit in which the elements of scientific models are
described as metaphors is the same as that of the proposition that all
words of a spoken language are metaphors. The idea that all communication
is metaphor is an idea that "has ancient origins in oral cultures and has
been repeated and debated through history" (Gozzi, 2000) by Plato, Vico,
Keats, Shelley and many modern linguists. McLuhan (1964) quotes
Quintilian: "Nearly everything we say is metaphor."

Once a model is formulated then the scientists find mathematical and/or
logical relationships between the metaphors of their models. They then
find and prove that certain relationships exist between these metaphors
which serve as predictions of their models. The scientists then test the
validity of these relationships that they have discovered in their toy or
model world of metaphors and mathematics. By testing we mean that the
scientists try to match their new metaphors transformed or calculated by
their logical or mathematical operations with observables in nature. The
most one can say, à la Hume, is that the new metaphors transformed by
mathematics from the original metaphors of the starting model make a good
match to the observed phenomena of nature. This agreement supports the
scientist's model but does not prove that the model is correct because one
must leave open the possibility that the model can be falsified or
refuted.

If, as noted above, all words are metaphors and all scientific models are
also metaphors, there is no need to prove scientific statements are true.
One can not prove a metaphor is true; one can only test whether or not it
provides a useful description of nature which leads to greater insights
and in the case of science to more predictions. It is the natural process
of a language to evolve; the same is true of the meaning of words and
metaphors. Words are continually bifurcating, keeping their old meanings and
taking on new meanings. The new meanings, however, carry with them
vestigially some of the structure or meaning of their ancestors just as
animals and plants vestigially retain structures from their ancestors.
Scientific theories and models, which are made up of metaphors, also
evolve and bifurcate into new models which vestigially retain remnants of
earlier theories. Relativity and quantum physics still retain much of
classical Newtonian physics. Plus ça change, plus c'est la même chose (the
more things change, the more they stay the same).

All models are abstractions from nature and hence represent a reduced
reality. Mathematical transformations of the abstractions or metaphors of
a model may further degrade their accuracy and reduce their match with
empirical reality.

The role of science is not to prove or even to explain the phenomena of
nature but rather to uncover patterns that relate one set of phenomena to
another. The mathematizing of scientific models and metaphors and the
process of subjecting them to mathematical operations has proven to be a
successful technique in uncovering these patterns especially when
predictions that can be observed or measured are made.

The Complementarity of Complexity and Predictability

The assumption that the metaphors contained in the mathematical models
used to describe nature can be operated upon with linear mathematical
operators, yielding new relationships among the elements of the model
that will then correspond to what is observed in nature, is premised on
the notion that the relationship between the elements of the model and
the elements of reality is linear. This is a basic presupposition which
can not be proven mathematically but must be tested empirically, and it
can not be presumed to be necessarily true.

The effect of a non-linearity between the model and reality can become
magnified if the mathematical equations relating the elements of the model
are themselves non-linear. A small difference or non-linearity between the
mathematical model and the reality being modeled can lead to vastly
different outcomes, à la the butterfly effect of Lorenz.
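The butterfly effect can be made concrete with a short numerical experiment. The sketch below (my own illustration, not from the paper) integrates the classic Lorenz system with the standard parameters σ=10, ρ=28, β=8/3, using simple forward-Euler steps, and follows two states that initially differ by one part in 10^8:

```python
# The butterfly effect in the Lorenz system:
#   dx/dt = sigma*(y - x),  dy/dt = x*(rho - z) - y,  dz/dt = x*y - beta*z
# Two trajectories starting 1e-8 apart end up macroscopically different.
def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def distance(p, q):
    return sum((u - v) ** 2 for u, v in zip(p, q)) ** 0.5

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)          # a one-part-in-10^8 perturbation
max_sep = 0.0
for _ in range(10000):               # integrate to t = 50
    a, b = lorenz_step(a), lorenz_step(b)
    max_sep = max(max_sep, distance(a, b))

print(f"largest separation reached: {max_sep:.2f}")
```

Both runs use the same deterministic equations; the divergence comes entirely from the tiny difference in the initial data, which is precisely the non-linearity magnification described above.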

Quantum mechanics and the uncertainty principle have taught us that the
process of measuring nature at the atomic scale changes the phenomena we
are observing and scrutinizing. The modeling of nature using metaphors
introduces a new level of uncertainty in matching one's model with nature
especially when one attempts to represent non-linear phenomena using
classical pre-chaotic physics. Paradoxically the introduction of chaos
has led to the discovery of new patterns and insights into the nature of
non-linear dynamic systems ranging  from ecosystems to the origins of the
universe.

Within the new physics of chaos or complexity theory chaos or the
uncertainty associated with not being able to make predictions of the
behavior of non-linear systems leads, as Prigogine first suggested, to new
levels of order. The Heisenberg uncertainty principle in quantum mechanics
leads to an understanding of the wave nature of particles and the particle
behavior of light and by association to the wave behavior of the
probability amplitudes needed to describe atomic and sub-atomic particles
and make predictions about their behavior. Just as momentum and position
(or energy and time) play complementary roles in the Heisenberg
uncertainty principle, complexity and predictability seem to play a
similar role. When dealing with non-linear phenomena like the weather, the
greater the scope of a model the more complexity it can embrace but the
less predictability it incorporates and the greater is its chaos. This
parallels the Heisenberg uncertainty principle where the more one knows
about the momentum the less one knows about the position and vice versa. A
similar situation holds in dynamic modeling as well. The greater the
predictability of a model the less complex it is and the smaller the
number of elements that can be successfully modeled.  Consider
gravitational systems like the solar system. The two body problem yields
total predictability as the equations describing motion can be solved in a
closed form. With three or more bodies there is no general closed-form
solution; as the number of bodies increases, the complexity increases and
the predictability decreases. Complexity and
predictability are complementary in the same sense as momentum and
position within the context of the Heisenberg uncertainty principle.
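The total predictability of the two-body case can be seen in Kepler's third law, the closed-form result of solving the two-body equations of motion: the orbital period follows directly from the semi-major axis, with no step-by-step simulation at all. The sketch below (my own illustration, using standard published values for the Sun's gravitational parameter and Earth's orbit) recovers the length of the year; nothing comparable exists once a third body is added.

```python
import math

# Kepler's third law, the closed-form solution of the two-body problem:
#   T = 2*pi*sqrt(a^3 / GM)
# Given only the semi-major axis, the period is predictable indefinitely.
GM_SUN = 1.32712440018e20    # Sun's gravitational parameter, m^3/s^2
A_EARTH = 1.495978707e11     # semi-major axis of Earth's orbit, m

period_s = 2.0 * math.pi * math.sqrt(A_EARTH ** 3 / GM_SUN)
period_days = period_s / 86400.0

print(f"predicted orbital period: {period_days:.1f} days")
```

One line of algebra predicts Earth's motion arbitrarily far ahead; for three or more bodies the same question can only be answered by numerical integration, with the sensitivity to initial conditions that entails.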

The decrease in the predictability of a model of a non-linear dynamical
system because of the increase in chaos does not represent a shortcoming
of the model but rather an attempt to be complete by including the full
complexity of the phenomenon being represented. In the spirit of the
Non-Probativity Theorem there is no reason to believe a priori that a model
representing nature should be both complete and totally predictable.
Goedel's Theorem can serve as a possible model to better understand the
complementarity of complexity and predictability. Goedel's Theorem states
that a sufficiently powerful formal mathematical system can not be both
complete and logically consistent at the same time. If we think of
predictability of phenomena as
a form of logical consistency with the basic laws of nature  and consider
complexity as a form of completeness then Goedel's Theorem also supports
the notion that total complexity or completeness of a model precludes
complete predictability.

The rejection of chaotics and complexity theory by adherents of the older
paradigms of Newtonian physics, relativity  and quantum mechanics is due
to the fact that the new physics places limitations on the predictability
of nature. Einstein critiqued quantum mechanics because he proclaimed,
"God does not play dice". The new physics is even more disturbing to this
new generation of skeptics who have to contend with the notion that not
only does God play dice at the atomic and sub-atomic level but he also
plays it at the macro level. As a consequence, one must give up on
the notion of prediction of certain phenomena at the macro level,
something that not even quantum mechanics required despite the fact that
it made use of probability at the micro level. Equally disturbing to some
is the fact that the very existence of human life might also be the result
of a random roll of the dice.

The new physics places limitations on the ultimate ability of science to
predict certain phenomena critical to human survival such as the weather
and large scale climatic change no matter how sophisticated our
computational skills become. Buying into the new physics requires
accepting the fact that some problems are intractable. This requires a new
level of humility on the part of science which has enjoyed a period of
unprecedented success for over 500 years in which it has been able to
describe and explain almost every phenomenon it has encountered. Are we
willing to sacrifice the sacred cow of predictability and accept a more
modest role for ourselves in our quest for understanding our universe?
Will we accept a world view in which chaos and non-predictability are
regarded as natural outcomes of the complexity and diversity of our
universe, a richness which makes it possible for us to formulate this
dilemma? I believe that the next generation of physicists will happily
sacrifice this sacred cow and move on to a higher and deeper understanding
of nature in much the same way that the Hebrews gave up the golden calf at
Sinai and embraced ethical monotheism, not without becoming stiff-necked,
however. The only solace that can be offered to those who are disturbed by
the lack of predictability of the new physics is that events are still
causally connected but that at the edge of chaos where self-organizing
criticality takes place science will not be able to determine which new
form of equilibrium will emerge.

Conclusion

In this paper we have attempted to show the strengths and limitations of
science when regarded as a language with its dual role of communication
(description) and information processing (predictability). The
Non-Probativity Theorem underscores a long-held belief that scientific
truth is not absolute but always subject to further testing. We have tried
to link the limitations on predictability within the framework of the new
physics of non-linear dynamics with the Heisenberg Uncertainty Principle
and Goedel's Theorem. We have suggested that the chaos and
non-predictability of complexity theory allows a more complete and fuller
description of nature.

References

Bertschinger, Edmund. 2000. The Call of Science: Theological Reflections
on the Ethics of Vocation. Humanity and Cosmos Symposium.

Coyne, George. 2000. When the Sacred Cows of Science and Religion Meet.
Humanity and Cosmos Symposium.

Gozzi, Raymond. 2000. Private Communication by E-Mail.

Logan, Robert K. 1995. The Fifth Language. Toronto: Stoddart Publishing.

Logan, Robert K. 2000. The Sixth Language. Toronto: Stoddart Publishing.

McArthur, Daniel. 2000. Cosmology, Creation, and Pseudo-Questions.
Humanity and Cosmos Symposium.

McLuhan, Marshall. 1964. Understanding Media. New York: McGraw Hill. Ch.
14. p.127. Signet Edition.

Schmandt-Besserat, Denise. 1986. The Origins of Writing. Written
Communication 13.

Acknowledgments: I wish to thank the organizers, David Goicoechea, Keith
Sudds and Sylvia Baago and the participants of the Humanity and the Cosmos
Symposium, January 20-22, 2000 sponsored by Brock University and the Brock
Philosophical Society where the ideas for this paper were incubated. I
would also like to thank George Coyne personally for his powerful key note
address which inspired the ethical dimensions of this paper. For their
insightful presentations and lively discussions I would like to
individually thank, Hugo Fjelstad Alroe, David Atkinson, Richard Berg,
Edmund Bertschinger, Leah Bradshaw, Bruce Buchanan, David Crocock, G.E.
Dann, Darren Domski, George Ellis, David Goicoechea, Anoop Gupta, Calvin
Hayes, Daniel McArthur, K. McKay, M.J. Sinding. I would also like to
acknowledge my colleagues on the Media Ecology listserv for a recent
stimulating discussion of metaphors with Jim Curtis, Raymond Gozzi,
Randolph Lummp, John Maguire, Eric McLuhan, Lori Ramos and Lance Strate. I
also wish to thank my colleague in the Physics Dept. at the U. of Toronto,
Prof. Ken McNeill and Marek Swinder, my former graduate student, for their
comments which improved this paper.
----




     --- from list technology-AT-lists.village.virginia.edu ---

   
