Invited Symposium: Nonlinear Dynamical Systems in Psychiatry







Archetypal Dynamics


Contact Person: Dr. William Sulis (sulisw@mcmaster.ca)


Introduction

The purpose of this paper is to provide an ontological framework for the development of a mathematics suited to the modeling of systems in which information plays a pivotal role. Such systems are ubiquitous in nature, and include many quantum mechanical, biological, psychological, social, and computational systems. It will be argued that the dominant folk ontology which underlies much of mathematical modeling today, based as it is upon an objectification of the natural world, fails to capture the essential characteristics of such systems, and therefore provides an inadequate foundation for model building and theory. Instead, an ontology based upon the concepts of emergents and entities is proposed, leading to two new classes of dynamical systems models and an alternative foundation for a mathematics, based upon threads, links and tapestries to complement the traditional set theoretic approach.

The original inspiration for this work lay in an attempt to understand C. G. Jung's ideas on the existence and nature of the archetypes and the collective unconscious. Rather than becoming trapped in the debates over the evidence for and against the validity of these concepts as providing a realistic model of psychological dynamics, my focus instead turned to exploring whether a formal model could be found in which some analogue of the archetypes and the collective unconscious could be made manifest and their respective dynamics implemented in a consistent manner. While leaving unanswered questions concerning the validity of Jung's model, this would at least provide evidence as to its internal consistency and feasibility. The impossibility of such a formal model would provide strong evidence against the archetype/collective unconscious model, while a demonstration of its feasibility would, at the very least, stimulate a renewed investigation into these ideas. An initial attempt to provide such a model was based upon collective intelligence systems (Sulis, 1996b, 1997). The ontological and pragmatic problems encountered during that work served as the inspiration behind this present undertaking.

Jung's conception of the archetypes and of the collective unconscious is richly textured, complex, conceptually deep and encompassing. Unfortunately it is also marred by obscure writing, inconsistent formulations, and an excessive tendency to eschew scientific observation in favour of narrative arguments. The evidence which Jung gathered in favour of his hypotheses came predominantly from alchemy, mythology, symbology and semiotics rather than from clinical or scientific psychology. His arguments depended heavily upon the occurrence of striking similarities and implausible coincidences across situations and times, rather than upon critical quantitative observation and analysis. This leaves the validity of his ideas and his methods open to criticism. Yet overall, the theory remains one of the few modern attempts at a complete psychology, and one which is consistent with many, though not all, clinical observations. The idea of the archetypes has had a deep and lasting effect upon the humanities. For these reasons it deserves serious and sober consideration and evaluation.

Jung described many different archetypes, notably the shadow, animus, anima, self, mother and the trickster. He also posited a succession of archetypes underlying psychological development and the process of individuation. Later writers have elaborated these ideas. Pearson (1991), for example, describes twelve distinct archetypes: innocent, orphan, warrior, caregiver, seeker, lover, destroyer, creator, ruler, magician, sage and fool. The descriptions seem much closer in spirit to mythology and literature than to science. In part this association can be traced to Jung himself.

Throughout much of Jung's writings the idea of the archetype is treated as a primitive concept whose existence requires no proof but instead is simply assumed. Like all primitive concepts, it defies definition. The closest expression of a definition might be the following. "A more or less superficial layer of the unconscious is undoubtedly personal. I call it the personal unconscious. But this personal unconscious rests upon a deeper layer, which does not derive from personal experience and is not a personal acquisition but is inborn. This deeper layer I call the collective unconscious. I have chosen the term 'collective' because this part of the unconscious is not individual but universal; in contrast to the personal psyche, it has contents and modes of behavior that are more or less the same everywhere and in all individuals. It is, in other words, identical in all men and thus constitutes a common psychic substrate of a suprapersonal nature which is present in every one of us. ... Psychic existence can be recognized only by the presence of contents that are capable of consciousness. We can therefore speak of an unconscious only insofar as we are able to demonstrate its contents. The contents of the personal unconscious are chiefly the feeling-toned complexes, as they are called; they constitute the personal and private side of psychic life. The contents of the collective unconscious, on the other hand, are known as archetypes (Jung, 1935, pg 287)".

This definition provides merely an analogical or metaphorical referent through which the reader is expected to ground their conception of archetype. It does not provide an operational definition through which archetypes can be identified and studied. Jung believed that such operational definitions could not exist. Archetype was a true primitive concept. He wrote, "The archetypal representations (images and ideas) mediated to us by the unconscious should not be confused with the archetype as such. They are very varied structures which all point back to one essentially 'irrepresentable' basic form. The latter is characterized by certain formal elements and by certain fundamental meanings, although these can be grasped only approximately. ... It does not appear, in itself, to be capable of reaching consciousness. ... Since other archetypes give rise to similar doubts, it seems to me probable that the real nature of the archetype is not capable of being made conscious, that it is transcendent, on which account I call it psychoid. Moreover, every archetype, when represented to the mind, is already conscious and therefore differs to an indeterminable extent from that which caused the representation (Jung, 1935, pg 83)".

According to Jung the archetypes appear to be very similar to the Platonic ideals, in that we only ever observe them in symbolic or metaphorical form through images and imagery, but never in their native state. As a consequence Jung devoted much of his career to the study of symbols, imagery, mythology, and alchemy, attempting to gain some understanding of the archetypes through their transformations and representations. Jung provided a persuasive case for the existence of deeply embedded, unconscious influences shaping human behavior, expressed both individually and in societies and cultures. He demonstrated the powerful roles that meaning, information, and symbols play in human affairs, clearly affirming their causal efficacy. Yet he seemed unable to accord them a fundamental ontological status, at least to the degree accorded to the so-called objective elements of reality. He struggled in his attempts to provide a causal explanation of the archetypes. He seemed unable to extricate himself from the objectivist, reductionist paradigm so prevalent at the time. As a result, his theory bears more features of theology than of science. He appeared to recognize this, writing, "Objective reality requires a mathematical model, and experience shows that this is based on invisible and irrepresentable factors. Psychology cannot evade the universal validity of this fact, the less so as the observing psyche is already included in any formulation of objective reality. Nor can psychological theory be formulated mathematically, because we have no measuring rod with which to measure psychic quantities. We have to rely solely upon qualities, that is, upon perceptible phenomena. Consequently psychology is incapacitated from making any valid statement about unconscious states, or to put it another way, there is no hope that the validity of any statement about unconscious states or processes will ever be verified scientifically (Jung, 1935, pg 84)".

Such a fatalistic prediction is wholly justifiable if an objectivist ontology, that is, an ontology based upon the concept of object, and a mathematics based upon it become the only framework for the generation of formal models of reality. Objects are those aspects of reality which persist through time, whose existence is independent of that which observes them, which interact with the rest of reality in a passive, nonintentional, deterministic manner, and whose behavior and properties exist independent of the context within which they are embedded. As will be argued in this paper, the objectivist ontology applies to only a small fragment of reality, and those aspects to which it does not apply possess as much reality as those aspects to which it does. Moreover, a formal, mathematically explicable, scientific understanding may be achieved for these non objective aspects as much as for the objective. But to do so requires a new ontology and a new formal language. The search for such an ontology takes us to the heart of reality itself.

Metaphysics and ontology have acquired a negative stigma in the sciences, where they are accorded little significance because it is believed that the questions which they engender have either been answered, or are unanswerable. They are tossed with disdain into the dustbin of speculative thought. Yet, as Brian Cantwell Smith (1998) has recently demonstrated, ontological questions have acquired serious practical importance. Many of the most perplexing pragmatic problems in computer science are in fact ontological problems, and their solutions involve coming to grips with ontology, metaphysics, and with questions of meaning. Science has become a victim of its own success. The program of objectification of reality has led to spectacular successes within the physical sciences but it has failed to produce significant results in those fields in which information and meaning play fundamental roles. Science, with its emphasis upon universal truths, has provided us with a worldview which is sterile, mere form, without depth or substance. The reaction against science at the end of this century is less a retreat into irrationality than it is an attempt to restore meaning to its rightful place. It is not a reaction against science per se, but a reaction against the orthodox objectivist ontology which informs so much of modern science. It is a goal of this paper to demonstrate that a widening of ontological scope need not result in a loss of scientific rigor or of validity but will in fact ensure that validity is maintained as science attempts to address the problems of the life and social sciences.

This paper is not about the fundamental nature of reality. That is a subject which has a long intellectual history without any satisfying resolution. Instead, the focus of this paper is more limited. It is concerned with the folk ontologies which provide a base of meaning for the theories and models which scientists, and in particular mathematicians, create in order to describe and ultimately to understand reality. It is about the ontologies used in our everyday lives, which have the greatest impact on theory. Few scientists are philosophers, and few are guided in their work by deep philosophical argument. Instead, most are guided by intuitions and the implicit ontology inherent within our culture. It is this practical ontology which requires explication, since it bears the greatest influence. Emphasis is placed upon mathematics, since mathematics provides a common language for the expression of theory and models in science. Biologists, psychologists, and social scientists have, for the most part, eschewed mathematical language, and the reasons for this will be traced to the inadequacies of the objectivist program. The detailed arguments necessary to support the contentions made above are lengthy, and are currently being developed as a book of the same title as this paper. In this paper I will present a much simplified sketch of the central argument and offer the reader a new conceptual foundation for a mathematics for the social and information sciences.

The main thrust of this paper is captured in the following six points.

  1. Reality is irreducible. There is no ultimate foundation upon which all of reality rests and from which all knowledge can be deduced.
  2. Reality is complex. Reality functions through interaction, not action and reaction, and therefore information and meaning play pivotal roles.
  3. Reality is holistic. Figure-ground separations are arbitrary and pragmatic, not causal. Reality is deeply contextual.
  4. Reality is pluralistic. There is no fundamental theory of everything because reality transcends linguistic/symbolic description. Instead, there are theories of everything. Reality admits multiple, irreducible, yet complete ontological frames of reference, that is, multiple fragmentations into meaningful, causal units linked through a coherent dynamics. These frames may be interdependent, but they are separate. Archetypal dynamics is the term coined to express this deep structural-linguistic complexity.
  5. Reality is pattern. Transient, coherent patterns provide the basic substrate of reality. Phenomena and epiphenomena, in as much as they possess causal influence, provide the proper entities for observation and study.
  6. Reality is vital. Objects constitute a highly specialized subcomponent of reality. Emergents and entities comprise a far greater share of reality than do objects, and warrant at least equal status.

In the next few sections I will briefly review the major shifts in folk ontology which have taken place over the centuries and the evidence which is forcing a reappraisal of the current objectivist ontology. Following that, I will present the main features of the entity based ontology and the concepts of thread, tapestry and link as a foundation for an entity relevant mathematics.



Ontologies Mythical and Scientific

Mythological Ontology

The earliest folk ontologies and metaphysics can be found in the creation myths of ancient societies. Myth provided a medium through which an understanding of the nature of reality could be achieved and passed on to others. These early societies lacked our formal conceptions of theory and of model and instead constructed elaborate narratives, called myths, which drew heavily on metaphor and imagery. Lacking sophisticated technologies, the reality which these societies faced was a living one, constantly changing and demanding continual adaptation. Their world was dominated by interactions, both among themselves and with the environment in which they found themselves. The agencies with whom they interacted were intentional agencies, having their own unique motivations and goals. To deal with such agencies required a finely honed psychological understanding, a deep empathy, not only of themselves, but of the other creatures who shared the world with them. Their survival depended upon understanding the behavior of their prey, in order to maintain food supplies, and of their predators, in order to avoid becoming food themselves. They also needed an understanding of the myriad other features of their environment, especially the weather, and the flow of water.

Theirs was a world of change, and great care and attention was paid to whatever regularities might be discerned in the ebb and flow surrounding them. They utilized their deep empathy in constructing their myths, their narrative ontologies. In so doing, they projected their psyche onto the world as a whole. This was very apparent to Jung who wrote "In mythological research, we have contented ourselves until now with solar, lunar, meteorological, vegetal, and other comparisons. But we have almost completely refused to see that myths are first and foremost psychic manifestations that represent the nature of the psyche. The mind of the primitive is little concerned with an objective explanation of obvious things; it has an imperative need - or rather, his unconscious psyche has an irresistible urge - to assimilate all experiences through the outer senses into inner, psychic happening. The primitive is not content to see the sun rise and set: this external observation must at the same time be a psychic event - that is, the sun in its course must represent the fate of a god or hero who dwells, in the last analysis, nowhere else than in the psyche of man. All the mythologized occurrences of nature, such as summer and winter, the phases of the moon, the rainy seasons, and so forth, are anything but allegories of these same objective experiences, nor are they to be understood as 'explanations' of sunrise, sunset, and the rest of the natural phenomena. They are, rather, symbolic expressions for the inner and unconscious psychic drama that becomes accessible to human consciousness by way of projection - that is, by being mirrored in the events of nature (Jung, 1935, pg 54)".

Jung viewed myths as being a window into the psyche of primitive man. Accepting the objectivist ontology, Jung, like virtually all modern thinkers, dismissed the surface structure of myth as being patently false, and as constituting irrational projections of internalized psychological processes onto the external reality. The reality described in myth became a mirror to the soul of those who created it. In adhering to this perspective, Jung was led to postulate the existence of the archetypes and the collective unconscious as a means of explaining some of the commonalities which he observed in his comparisons of mythology from different cultures. However, in doing so, Jung failed to appreciate an even deeper aspect of myth. The unconscious mental processes which direct us did not arise de novo, but instead evolved over millions of years in close connection to the physical environment within which our ancestors lived out their lives. In order to maximize our chances at survival, many of the deep characteristics of that environment must be embedded somehow in the structure, organization and decision making of those unconscious processes. The soul becomes a mirror to reality. Myth therefore becomes a projection of reality onto itself, transformed through the medium of the unconscious. The challenge is to separate those aspects of myth which reflect deep aspects of reality from those aspects which reflect the filtering medium, the unconscious.

Most germane to my interest here is that particular subcategory of myth called the creation myth. "A creation myth conveys a society's sense of its particular identity; it reveals the way the society sees itself in relation to the cosmos. It becomes, in effect, a symbolic model for the society's way of life, its world view - a model that is reflected in such other areas of experience as ritual, culture heroes, ethics, and even art and architecture (Leeming, 1994, pg. vii)". Creation myths thus provide an expression of a society's basic ontological viewpoint. Consider the following creation myth, one variant of a set of myths which evolved in the regions of southern Siberia. "When Ulgen saw mud floating on the waters of pre-creation, he saw a human face and gave it life. Thus the first man, Erlik, was born. Soon Erlik forgot his place and boasted that he could create man as well as Ulgen could. Ulgen reacted by flinging his first creation into the ends of the earth, where he reigns as the devil. After the fall of Erlik, God created the earth and placed eight trees and eight men on it. The eighth man, Maidere, and the eighth tree stood on a mountain of gold, and at Ulgen's bidding Maidere created the first woman. When he saw that he could not give the woman life, Maidere left her in the care of a furless dog and went to get help from Ulgen. While he was away, Erlik came and offered that dog a fur coat in exchange for a look at the woman. He not only looked, but he also played seven flute notes into her ear, and she came to life possessed of seven tempers and many bad moods (Leeming, 1994, pg. 7-8)".

The leading candidate for a universal characteristic of early creation myths is that the causal agencies involved take the form of living creatures which are capable of richly textured, emotion laden behavior. The creator is not a dispassionate, rational, passive, or mechanical agency but rather is full of emotion and subject to sudden unpredictable twists in behavior. The creator has a past history which influences its decisions. The act of creation is considered to be an act of decision making; that is, it is a realization of but one of many possibilities, shaped by past history and by current context. These early peoples possessed technology, and primitive though it might have been, they had ample experience with artifacts and objects. They understood many aspects of the creation of structure and form by human intention. Yet they chose to represent causal agencies with the attributes of living entities. I would argue that such a universal choice was not a mere projection of psyche but instead reflected a deep appreciation of the reality which they experienced. For these early peoples, there was an implicit, and perhaps explicit, conception of the process of creation as being historically and contextually contingent and subject to random influences. Yet creation was not without order. Reflected in such myths is a recognition of the necessity for specific initial conditions together with a certain lawfulness governing the interactions which follow. Early peoples saw reality as the manifestation of entities, not objects.

"There are in fact some societies which regard myth as reality itself, more real than the objective universe. We find it difficult to understand such an attitude, which seems pathological in origin - or what is called pathological, although it is often nothing but an aggravation of the normal. So, when an individual is possessed by myth, we are apt to say that he is ill; when entire societies are affected, we tend, like the philosophers of old, to shrug our shoulders and speak of prelogical mentality (Grimal, 1965, pg 10)". These primitive societies might well be prelogical, yet they survived for thousands of years. Individuals within these societies may not have fared better than individuals of any species, but the societies prevailed, in part, I would argue, because they understood the nature of the world in which they lived. However, the lived world is not a technological world, and as technology spread there came with it an increasing need to understand and to master it, just as one had previously had to master the living world. The characteristics of technological entities are not the same as those of living entities (Sulis, 1994). A reality of entities was superseded by a reality of objects. It is to this process of the objectification of reality that I now turn.

The Objectification of Reality

The shift from a vitalist to an objectivist ontology can be traced back at least to the time of the great Greek philosophers, most notably Plato and Aristotle. "To the Hellenic mind logos and mythos, 'reasoning' and 'myth', are two antithetic modes of thought. The former includes everything that can be stated in rational terms, all that attains to objective truth, and appears the same to all minds. The latter includes all that concerns the imagination, all that cannot be subject to verification, but contains its truth in itself or, and this amounts to the same thing, in powers of persuasion arising out of its own beauty (Grimal, 1965, pg 97)". In Greek culture we see both modes of understanding in co-existence, but the attitude towards myth has already undergone a considerable shift. Myth is no longer held as a direct expression of an ontological viewpoint, but instead is merely utilized as a medium through which to express deep philosophical ideas. Myth becomes allegory, metaphor, not to be taken literally. Myth begins to merge with art, gradually evolving into fiction and fable. Prelogical thinking gives way to logical thinking.

The objectification program appears to be a uniquely Western tradition. Eastern and Aboriginal ontologies retained many vitalist concepts well into this millennium. One reason for this may be the excessive attraction to and dependence upon technology which so characterizes Western culture. Technology was highly developed by the time of the ancient Greeks, and elevated to a position of high esteem by the ancient Romans. Both cultures valued the eternal, the Greeks in ideas, the Romans in structures. Roman culture produced few great philosophers, reflecting this difference in values. The ancient Greek philosophers emphasized the search for everlasting, eternal, universal Truth. Plato, as did Jung subsequently, believed that such truth lay in the Ideals, universals which lay forever outside of the grasp of mankind, hidden always by the particulars of their expression in the material universe. Mathematics, in as much as it deals with an ideal universe of pure thought and number, was thought to provide a window through which we could catch a glimpse of some of these Ideals. Such an attitude has imbued mathematics with a sense of mystique, as well as of divine authority, which continues to hold even today. This has been both a blessing and a curse, providing mathematics with a sense of aesthetics which, judging by the current debate over the role of proof and experiment in mathematics, may have outlived its usefulness. In any event, while mathematicians may have felt an affinity with Plato, the technologists and scientists were to find their champion in the writings of a pupil of Plato, Aristotle.

Aristotle was possessed of a brilliant mind, which encompassed all of the philosophical disciplines. He laid down the fundamental principles of logical reasoning, and founded the discipline of metaphysics, providing a foundation for virtually all of the natural sciences which lasted almost 1500 years. Aristotle, in his study of entities, introduced the four fold classification of causes (formal, material, efficient, and final). In his ontology, he introduced a four fold classification of entities, based upon two independent dichotomies: entities in a subject/not in a subject, and entities said of a subject/not said of a subject. He took as the ultimate realities those which are neither in a subject nor said of a subject, in other words, those things which are always subjects and never predicates. Aristotle clearly gave preeminence to objects, and defined reality in objective, materialist terms. He further emphasized the need to distinguish primary (particular) substances from secondary (universal) substances and argued that primary substances are logically prior; all other entities depend upon primary substances for their existence.

Aristotelian thought became dogma under the influence of the Church, which was instrumental in maintaining its position of prominence. Nevertheless, the program of objectification of reality was not universally accepted. Throughout the Middle Ages and the Renaissance, and even up to the time of Newton, vitalist traditions persisted in the field of alchemy. Alchemy, at least in the history of science literature, has long been regarded as the supreme example of pseudoscientific folly. Yet, as Jung so ably demonstrated (Jung, 1955), alchemy was a complex philosophical discipline whose origins lay deep within the psyche of man. Just as in the case of myth, Jung was able to show that alchemical ideas were a projection of internal psychological processes and relationships onto the external reality. Jung saw value in alchemy through its ability to illuminate the minds of those thinkers who created it, and therefore to illuminate the deep passages of unconscious thought. Again though, Jung failed to appreciate that alchemy was also an ontological system which mirrored reality through the distorting medium of the unconscious. In alchemy we find a strong emphasis upon relationships and transformations, fundamental aspects of the living world. The failure of alchemy stems primarily from its misapplication to those aspects of reality which were most objective in nature, instead of those which were most vital. Alchemy was best suited to describing a reality of living entities, not a reality of dead objects. Alchemy failed because its ontological prescriptions did not fit its subject. As we will see later, the objectivist program too is failing in the life and social sciences for much the same reason.

Although many of the specific details of Aristotle's metaphysics began to be called into question after the time of Galileo, its spirit remained alive and influential. It achieved its ultimate realization in the work of Isaac Newton, who combined meticulous observation and careful logic with formal mathematics to create the Principia Mathematica. This work, above all others, demonstrated the power of the objectivist approach. In it, Newton reduced reality to mathematical points existing in a framework of absolute space and time. Newton built his physics upon the concept of the object. Newton created a highly successful, predictive theory, and mathematics became the language of reality. The Newtonian paradigm held promise of providing an ultimate theory of reality, and even today it still exerts considerable influence.

Until the time of Newton, mathematics consisted mostly of number theory and geometry. Mathematics was almost purely descriptive in its orientation. After Newton, with the development of the calculus, mathematics was given power over time and space, the ability not merely to describe, but to predict as well. With that power came a deep sense of responsibility and of disquiet, because the original formulation of the calculus was more art than science. Newton's formulation of the calculus bears subtle traces of his alchemical interests. This proved unacceptable to later generations of mathematicians, especially those living in Germany in the 19th century. Following Plato, mathematics was viewed as being capable of revealing deep truths about reality, and therefore it was of the utmost importance that the procedures used in mathematics follow the highest standards of rigor and logic, and that the foundations of mathematics be free of confusion, uncertainty, and error. Under the influence of mathematicians of the stature of Dedekind, Weierstrass, Hilbert and Cantor, a rigorous foundation for mathematics based upon the concept of the set and a standard of evidence based upon deductive reasoning were established. From this work grew the hope that all of mathematics would someday be shown to be deducible from a small collection of fundamental axioms.

The notion of a set depends crucially upon the notion of an object. A set itself is a kind of object, and has meaning only in relationship to other objects. A set has an independent, objective existence, eternal, passive, acontextual. It has but one property, that of membership, which it possesses independent of context. However, the property of membership requires the existence of things which are capable of being members, and if the set itself is to be persistent, objective, and acontextual, so must those things which it contains. Thus the notion of set is inextricably linked to that of object, and so it is no wonder that the concept of set might be looked to as the foundation for an object oriented mathematics. Set is a powerful concept, apparently encompassing virtually everything which would be of interest to mathematicians and scientists alike. The idea of set has become so deeply ingrained in our thinking, especially as a result of the so-called new mathematics programs of the 1960's, that it seems impossible to conceive of anything else as being equally fundamental.

Psychology, the life, and the social sciences began the last century driven by vitalist folk ontological views. Fiscal and social pressures, stemming in part from an overvaluation of the successes of the objectivist paradigm in the physical sciences, have driven these disciplines to abandon their vitalist roots and attempt the objectification of their own subject matter. Species have given way to genes, organisms to macromolecules, minds to neurons, and qualia to neurotransmitters. Disciplines deeply rooted in human experience have rejected the validity of that experience in favor of objectified caricatures and simulacra. Progress has been made in understanding the caricatures and simulacra, but not in the subjects which they are purported to support or represent. In that regard, these new sciences have proven to be remarkably sterile. Unlike alchemy, whose misapplication led to patently false predictions, this modern misapplication of an inappropriate ontology has simply left nothing of interest behind it. The modern theory of complex systems provides another striking example in which the continued misapplication of the objectivist paradigm has led to great promise (some might say hype) and little progress. Clearly a new ontology is called for.

At the beginning of the first millennium, vitalist ontologies gradually gave way to objectivist ontologies, the dominant folk ontology in use today. At the end of the second millennium, these objectivist ontologies have begun to come under attack from a wide variety of sources. It is to a review of this evidence that I now turn.



Cracks in the Armour

Even as the Newtonian paradigm was solidifying into dogma, cracks in the armour were beginning to appear. These blows came from a variety of sources: biology, psychology, mathematics, computer science. The most serious came from within physics itself, through the discovery of the theories of relativity and of quantum mechanics.

Physics

The discovery of the special theory of relativity by Einstein in 1905 and its experimental verification demonstrated in a most convincing manner that there was no absolute space, nor any absolute time. These two fundamental aspects of Newtonian theory were found to be inconsistent with the relativistic understanding that there was no preferred frame of reference from which one could observe kinematical behavior, that is, motion. Our observations of space, of time, of mass, and of energy were found to depend upon the motion of the observer. Although properties remain characteristic of objects, their measurement, their quantitative valuation, becomes conditional upon certain characteristics of the observer.

The general theory of relativity had a more subtle effect upon the Newtonian program. The concept of force as a causal agency plays a crucial role in Newtonian mechanics. General relativity dispensed with the concept of force, replacing it with a purely geometric representation. The universe became considerably more passive in its actions. Moreover, the inseparability of space and time, together with a rigid adherence to the idea of causal determinism, resulted in a static universe, one in which there is no past, present, or future. General relativity provided the first example of a contextual theory, in which the motion of a system depends upon the structure of the entirety of the space and time within which it is embedded.

Quantum mechanics raised even deeper questions about the relationship between measurement and property, about the object to which these are attributed, and about the fundamental nature of reality. The quantum world is irreducibly contextual. The two slit experiment remains to this day the archetypal example of this. A beam of electrons is allowed to pass through a screen in which are placed two slits, and then to travel to a second screen which records the impacts. If both slits are open, the pattern of impacts on the recording screen exhibits a series of peaks and valleys, with a peak at the center. Such a pattern is identical to that which would be observed if a wave had passed through the first screen, interacting with itself on its way to the recording apparatus. However, if one of the slits is closed, then the pattern of impacts has but a single peak, centered directly across from the open slit. This is the pattern which would be obtained if a beam of small rigid bodies had passed through the open slit. Yet the beam itself is never changed! This result occurs even if the electrons are allowed to pass through the slit(s) one at a time and a cumulative recording of impacts is obtained.

As if this were not perplexing enough, suppose that one were to place a marker on the recording side of one slit so that the particles which pass through it are made distinguishable from those passing through the second slit. In this case, the interference pattern disappears even when the marking process does not interfere in any way with the motion of the particles which are marked. In fact, the interference pattern will disappear even if the marking is just a possibility, without actually being carried out. No physical interaction need take place between the marker and the particle beam in order for the interference pattern to be destroyed. It is the context alone, the potentiality of measurement rather than the actuality of measurement, which determines which pattern will appear.
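
As an aside, the fringe/no-fringe contrast just described can be reduced to a short numerical sketch. The fragment below (Python, with arbitrary illustrative parameters, ignoring the single-slit diffraction envelope) is not a model of any particular experiment; it simply adds the two slit amplitudes when the paths are indistinguishable, and adds their probabilities when which-path information is available.

    # Toy two-slit sketch: add amplitudes when the paths are indistinguishable,
    # add probabilities when the paths are marked. Far-field approximation,
    # arbitrary units, single-slit envelope ignored.
    import numpy as np

    wavelength = 1.0          # de Broglie wavelength
    slit_separation = 5.0     # distance between the two slits
    screen_distance = 100.0   # distance from the slits to the recording screen

    x = np.linspace(-40, 40, 801)                  # positions on the recording screen
    delta = slit_separation * x / screen_distance  # path-length difference
    phase = 2 * np.pi * delta / wavelength

    psi1 = np.ones_like(x).astype(complex)         # amplitude from slit 1 (idealized)
    psi2 = np.exp(1j * phase)                      # amplitude from slit 2

    coherent = np.abs(psi1 + psi2) ** 2                  # indistinguishable paths: fringes
    incoherent = np.abs(psi1) ** 2 + np.abs(psi2) ** 2   # marked paths: no fringes

    print("coherent pattern ranges over",
          round(coherent.min(), 2), "to", round(coherent.max(), 2))
    print("incoherent pattern is flat at", round(incoherent.mean(), 2))

The coherent sum oscillates between zero and four times the single-slit intensity, while the marked sum is flat; which of the two appears depends only on whether path information is, in principle, available.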

Recently, experiments with photons have demonstrated the extent to which context influences behavior. It is possible to set up an apparatus in such a way that a beam of correlated photons can be split so as to follow several, initially indistinguishable paths. These beams can then be combined, resulting in the formation of interference patterns at a measurement device. A marker can then be introduced along one of these paths, making those photons distinguishable from the others and destroying the interference pattern. Again, the motion of the photons is unaffected by the marking process, which merely alters their angle of polarization. It is the informational aspects of the situation which determine whether or not the interference pattern is observed. Most surprising, though, has been the demonstration that this marking can be carried out after the photons along the second path have already been received at the detector! Moreover, it is possible to introduce a so-called quantum eraser, which can remove the path information after the marking has taken place and restore the interference pattern, even though the second path photons have already been detected. The context of a quantum system must be understood broadly, and includes both spatial and temporal information. The dynamical character of a quantum system, unlike its classical counterpart, is no longer an intrinsic property of the system but instead depends upon the context within which it is expressed (Goldstein, 1998a, 1998b).

These results demonstrate the validity of the Copenhagen interpretation of quantum mechanics which asserts that a quantum measurement is not a measurement until it is an observed measurement, that is, until a classical measurement apparatus has recorded it. Quantum measurements possess a deep duality. Quantum behavior is governed by the general laws of quantum mechanics, which involve uncertainty and probability in a fundamental manner. The behavior of quantum objects is described by wave functions, which provide probability distributions over sets of observables, such as position, momentum, energy. Classical mechanics is governed by the laws of Newton, which are fundamentally deterministic. The behavior of a classical object is described by a function which provides a single observable for each instant of time. These two theories adequately describe the realms for which they were devised, yet are incompatible, both ontologically and mathematically. Nevertheless, the measurement of a quantum object depends in a deep way upon the classical properties of the apparatus which measures it. The very existence of a measurement appears to depend upon the objective properties of the measurement device, in particular its persistence and context independence. Quantum measurement involves two irreducible ontological domains, the quantum and the classical. A quantum measurement requires the binding of a quantum system to a classical system, resulting in an identifiable change of state in the latter. This binding need not be physical. It is enough that it be informationally rich.

In one sense, the quantum level determines the classical level, since presumably all classical objects consist of large collections of quantum objects held together in suitable arrangements. Nevertheless in another sense, the quantum level is determined by the classical level, since specific behavior is not defined until a classical context has been established, and specific measurements cannot be made unless a classical measurement device is present. We have a situation in which there exist two distinct ontological frames of reference, yielding distinct modes of behavior and causation, distinct webs of meaning and description, neither reducible to the other, and yet which coexist, intermingle, and are codependent.

One can observe similar ontological dualities in a variety of situations. Fox and Elston (1993) have demonstrated, both formally and experimentally, that in certain nonlinear systems the underlying nonlinear dynamics can cause the amplification of low level fluctuations to the point that a detectable fluctuation appears at the macroscopic level. Classical operators such as the average no longer exist, so that there is no simple description of the macroscopic level in terms of the microscopic level. Conversely, any description of the macroscopic behavior must include microscopic observables as well, or these fluctuations cannot be accounted for. Information from at least two distinct ontological frames of reference must be provided in order to adequately describe and predict the behavior of the system at even a single level.
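
A toy calculation can make this kind of amplification concrete. The sketch below is not Fox and Elston's model; it is a generic illustration, using the fully chaotic logistic map with arbitrary parameters, of how a nonlinear dynamics can stretch a fluctuation lying far below any measurement resolution into a macroscopic difference.

    # Generic illustration (not Fox and Elston's system): a chaotic logistic map
    # amplifies a microscopic spread across an ensemble of nominally identical
    # copies until the spread becomes macroscopic.
    import numpy as np

    rng = np.random.default_rng(0)
    r = 4.0                       # fully chaotic logistic parameter
    n_copies, n_steps = 1000, 60
    noise = 1e-12                 # microscopic fluctuation scale

    x = np.full(n_copies, 0.3) + noise * rng.standard_normal(n_copies)
    for _ in range(n_steps):
        x = r * x * (1.0 - x)

    # The ensemble spread grows from ~1e-12 to order 0.3: a fluctuation far below
    # any realistic resolution has become a macroscopic difference.
    print(f"initial spread ~ {noise:.0e}, final spread ~ {x.std():.2f}")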

Mathematics provides another example in the two dimensional cellular automaton, Life. A two dimensional cellular automaton consists of a large, possibly infinite, two dimensional grid, resembling a chessboard. Each cell of the grid can be marked with one of a finite number of states, say, in the case of Life, alive or dead. At any given time, the pattern of living and dead cells forms a state of the cellular automaton. In order to determine what the subsequent states of the cellular automaton will be, one applies a single rule simultaneously to all cells. In the case of Life, the rule is simple. Select any grid cell to be updated. Examine the state of every adjacent grid cell, and count the number of living cells. If the grid cell in question is alive, and if it has two or three living neighbours, then it will remain alive at the next time step, otherwise it dies. If it is dead, and it has exactly three living neighbours, then at the next time step it will become alive (born), otherwise it remains dead. This rule is fully deterministic and its repeated application enables one to determine any future state of the cellular automaton from any given state. That is, in principle, the dynamics of this system is wholly understood.
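
A minimal sketch of this update rule may help to fix ideas. The fragment below (Python) assumes, purely for convenience, a small finite grid with wrap-around boundaries, and uses the well known "glider" pattern as an example.

    # Minimal sketch of the Life update rule described above,
    # on a finite grid with periodic (wrap-around) boundaries.
    import numpy as np

    def life_step(grid):
        """One synchronous update of Life applied to a 0/1 array."""
        # Count the eight neighbours of every cell by summing shifted copies.
        neighbours = sum(
            np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0)
        )
        survive = (grid == 1) & ((neighbours == 2) | (neighbours == 3))
        born = (grid == 0) & (neighbours == 3)
        return (survive | born).astype(int)

    grid = np.zeros((8, 8), dtype=int)
    for y, x in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
        grid[y, x] = 1              # a "glider", a small self-propagating pattern
    for _ in range(4):
        grid = life_step(grid)      # after four steps the glider has moved one cell diagonally
    print(grid)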

However, there is a surprise lurking inside this simple picture. The mathematician J. H. Conway, while searching for the presence of non-repeating patterns, demonstrated that there existed certain localized patterns of states of cells which could be arranged in such a manner as to be interpreted as switches within a digital computer. Furthermore, he demonstrated that these switches could be arranged in such a way as to actually simulate the actions of a digital computer. An initial configuration of these switches could be chosen which would correspond exactly to the input of a specific program, and data, and the subsequent evolution of the Life system would correspond exactly to the activities of its matching digital computer as it ran the program using the data. The final configuration of the Life system would correspond to the final output of the computer, and could be read directly from an examination of the final configuration of cell states. That such a possibility existed was a wholly unexpected result, and has given rise to the concept of emergence, the presence of a high level behavior or phenomenon which is not predicted from an understanding of the low level dynamics. Emergence has become a key concept in the theory of complex systems, but, like the use of the term chaos in nonlinear dynamics, it is a term which is misplaced.

If one carefully examines the Life scenario, one realizes that in point of fact nothing emerges out of the low level dynamics. The presence of the association between these patterns and the digital computer requires the selection of a highly special set of local initial configurations, all arranged in a highly specific manner. Moreover, that arrangement depends crucially upon the existence of an ontological frame of reference, i.e. the digital computer and its associated concepts, which exists outside of Life. There is nothing intrinsic to these patterns which provides them with any significance or meaning. They are not intrinsically generated but instead must be imposed from outside the Life system. There is no true emergence here, no higher level structure arising out of the depths. Instead, what we have is the presence of two irreducible ontological frames of reference, the low level automaton dynamics and the high level computer dynamics, which describe one and the same system. Life provides a striking example of ontological duality, and what is commonly referred to as emergence should more fairly be described as janusification, after the two faced god, Janus.

Note that there is no simple mapping between the high and low level descriptions of this system since different pattern configurations could give rise to the same computations and in addition, even with a fixed arrangement of pattern elements, the underlying cells which support these patterns are not predetermined. One can freely choose the supporting cells so long as the high level organization is preserved. The cells merely act out the roles assigned to them, and they are free to take on any role potentially. Both the macroscopic and the microscopic descriptions are necessary in order to describe and give meaning to the dynamics of the system.

Crutchfield and Hanson (1993) used a related approach in their work on the analysis of one dimensional cellular automata. They decomposed the global patterns of automata into coherent local subpatterns which they term particles, and then developed a theory of the dynamics of these particles relative to each automaton which provides a description of the dynamics of the whole. These particles are extracted from the original pattern by identifying broad regions of related patterning, much like a wallpaper pattern, and then deleting these regions from the original, leaving behind only the boundaries between the regions. These are virtual particles. Nevertheless, once the particles have been separated out from the background, they can be observed to possess a rich and complex dynamics of their own. Moreover, significant information about the behavior of the original system can be obtained from a study of the particle dynamics alone. Criticism of their work has emphasized the virtual nature of these particles, but fails to take note of the janusification inherent in the descriptions of these systems, and thus misses a fundamental feature which Crutchfield and Hanson implicitly capitalize upon.
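
The fragment below is only a crude stand-in for Crutchfield and Hanson's construction, which builds regular-language filters for the domains of a given automaton. As a hedged toy, it simply treats the most frequent length-three neighbourhood patterns in each row of the space-time diagram as background and blanks them, leaving the rarer cells which tend to lie on domain boundaries. The rule number, lattice size, and threshold are arbitrary choices.

    # Crude domain filtering for a one dimensional cellular automaton: blank the
    # cells whose local pattern is common (the "wallpaper"), keep the rare ones
    # (candidate particle/boundary cells). Not Crutchfield and Hanson's actual
    # regular-language filter, only a rough illustration of the idea.
    import numpy as np
    from collections import Counter

    def eca_step(row, rule=110):
        """One step of an elementary cellular automaton, periodic boundaries."""
        table = [(rule >> i) & 1 for i in range(8)]
        left, right = np.roll(row, 1), np.roll(row, -1)
        return np.array([table[4 * l + 2 * c + r]
                         for l, c, r in zip(left, row, right)])

    def filter_row(row, keep_fraction=0.1):
        """Keep only cells whose 3-cell neighbourhood pattern is rare in this row."""
        left, right = np.roll(row, 1), np.roll(row, -1)
        codes = 4 * left + 2 * row + right
        counts = Counter(codes.tolist())
        cutoff = max(counts.values()) * keep_fraction
        return np.array([1 if counts[int(c)] < cutoff else 0 for c in codes])

    rng = np.random.default_rng(1)
    row = rng.integers(0, 2, 400)
    filtered = []
    for _ in range(200):
        row = eca_step(row)
        filtered.append(filter_row(row))
    # "filtered" now highlights candidate particle cells against a blanked background.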

Many other attributes of objects fail to hold in the quantum world. Classical objects are localized in space and are governed by local forces. Action at a distance is denied by the theory of relativity. Indeed, the consequences of relativity theory demand that objects be localized in space and in time. Objects which are sufficiently separated in space and in time must therefore be dynamically uncoupled. This realization formed the basis of the famed Einstein-Podolsky-Rosen (1935) thought experiments, which were conceived as a means of demonstrating the incompleteness of quantum mechanics as a physical theory. In these experiments, two particles are created in such a manner as to have their quantum states entangled, so that they possess identical, or at least strongly coupled, quantum states. They are then separated from one another to a great distance and measurements of a pair of complementary observables are made, one for each particle. Complementary observables are linked through the Heisenberg uncertainty relations, so that in theory the measurement of one precludes the measurement of the other. By separating the particles to a distance great enough that, according to relativity theory, a signal could not pass from one to the other in the time taken to carry out the simultaneous measurements, it should in principle be possible to measure both complementary observables to any degree of accuracy, in violation of the Heisenberg relations. At the time that Einstein, Podolsky and Rosen put forward their ideas, the experimental tools needed to carry out the experiment did not exist. Following the work of Bohm and Bell, Aspect and others have been able to carry out analogues of the EPR experiments using correlated photons, and quantum mechanics has been vindicated. No matter how large the separation becomes, the particles in an EPR experiment remain quantum correlated unless disturbed by some interaction. These two particles behave as if they were a single object, extended over space and time, held together by nonlocal informational connections.
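
The essential arithmetic behind such Aspect-type tests fits in a few lines. The sketch below is not the original EPR argument; assuming the standard two-particle singlet state, it computes the joint expectation value for spin analyzers set at chosen angles, and the CHSH combination of four such settings, whose magnitude reaches 2*sqrt(2) even though any assignment of pre-existing, context-free, object-like values is bounded by 2.

    # Singlet-state correlations of the kind measured in Aspect-type experiments.
    # The joint expectation depends only on the relative analyzer angle, and the
    # CHSH combination exceeds the local-realist bound of 2.
    import numpy as np

    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)   # (|01> - |10>)/sqrt(2)

    def spin(theta):
        """Spin observable along an axis at angle theta in the x-z plane."""
        return np.cos(theta) * sz + np.sin(theta) * sx

    def E(a, b):
        """Joint expectation of (a.sigma) tensor (b.sigma) in the singlet state."""
        return np.real(singlet.conj() @ np.kron(spin(a), spin(b)) @ singlet)

    a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
    print("E(a1, b1) =", round(E(a1, b1), 3))        # -cos(pi/4), about -0.707
    S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
    print("CHSH S =", round(S, 3), "; local realism requires |S| <= 2")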

Another characteristic of objects is that they are unable to hold mutually contradictory properties simultaneously. Schrodinger's famous thought experiment was meant to suggest a fundamental inconsistency in quantum theory. In this experiment, a cat is placed in a closed experimental apparatus and its wave function coupled to that of a nuclear decay process. There is a small pellet of cyanide attached to a device which releases the pellet in the event that an alpha decay occurs, resulting in the subsequent death of the cat. The entire apparatus is isolated from the outside world. The quantum superposition principle applied to the cat - nuclear decay system yields a coupled system which exists in a superposition of two states - one in which an alpha particle has been emitted and the cat is dead, and one in which it has not and the cat is still alive. The cat is forced through the quantum coupling to also exist in a superposition of states - dead and alive, simultaneously. In the classical world such a situation is impossible. Nevertheless, researchers (Haroche, 1998) have recently carried out a set of experiments involving single rubidium ions which were prepared in a macroscopically measurable quantum superposition of incompatible states. These ions are observed to shift back and forth between these states in an apparently stochastic manner, and can fairly be said not to reside in either state individually, but rather in both states simultaneously. Quantum objects do not possess determinate properties, only potentialities, which must be realized in some context.
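
The logic of the cat paradox can be made explicit with a toy two-level version of each subsystem. The sketch below is an idealization (equal amplitudes, no environment, hypothetical labels): it builds the entangled atom-cat state described above and then traces out the atom, showing that the cat considered on its own is left in a featureless mixture rather than in either classical state.

    # Toy Schrodinger-cat coupling: a two-level "atom" (not decayed / decayed)
    # entangled with a two-level "cat" (alive / dead). Idealized equal-amplitude
    # superposition, no environment.
    import numpy as np

    ket = lambda i: np.eye(2, dtype=complex)[i]
    not_decayed, decayed = ket(0), ket(1)
    alive, dead = ket(0), ket(1)

    # Superposition demanded by the linearity of quantum mechanics:
    # (|not decayed>|alive> + |decayed>|dead>) / sqrt(2)
    psi = (np.kron(not_decayed, alive) + np.kron(decayed, dead)) / np.sqrt(2)

    # Reduced state of the cat alone: partial trace over the atom.
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
    rho_cat = np.trace(rho, axis1=0, axis2=2)

    # Prints a maximally mixed 2x2 matrix, diag(0.5, 0.5): taken by itself the
    # cat is neither definitely alive nor definitely dead.
    print(np.round(rho_cat.real, 3))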

Computer Science

Computation had long been an important aspect of mathematics, but it was not until the early part of the Twentieth Century that the study of computation came into its own as a major mathematical discipline. Turing, Church, Kleene and many others helped to formalize and delimit the concept of computation. Recursion theory was born, and with it, the class of computable functions, which proved to be but a subclass of the class of all functions. Automata theory and mathematical linguistics also developed during this same period of time, providing a formal basis for computing devices and for Chomsky's formal linguistics. Von Neumann established a fundamental architecture for computing devices with the realization that the symbol sequences which formed the basic object of manipulation in computation required two distinct modes of interpretation: as a set of instructions to be carried out, and as data to be processed. He also laid out the basic serial architecture of computing hardware based upon the formal constructs of recursion theory. Originally, computer science was concerned with finding effective procedures for carrying out long and complex arithmetic computations. However, it was soon realized that computers provided an important tool for the processing of non numeric data, and this quickly supplanted earlier uses. Calculation, too, soon gave way to simulation as a task of considerable importance in both science and industry. The pragmatic importance of computers has provided the driving force behind the emergence of computer science as a scholarly discipline in its own right. In a somewhat paradoxical state of affairs, the most pragmatic of all scientific disciplines has forced a return to that most abstruse of all philosophical disciplines, ontology. It turns out that many of the deepest problems in computer science hinge upon an understanding of ontology. Brian Cantwell Smith (1998) has carried out a detailed exploration of the myriad of ontological questions which now plague the field of computer science. Physicists have been able to avoid their ontological problems by arguing for the primarily computational nature of physics. Computer scientists are not so lucky. The ontological problems of computer science have profound implications for the practice of computer science, since they affect the very practical business of writing effective software.

Many ontological questions involve understanding the nature of data types, and the relationship between names and meaning. In the real world, data is often interpreted in a multiplicity of ways, often changing throughout the course of a given project. As the interpretation changes, so does the appropriate set of transformations to which the data is subject. Meaning plays a critical role in real world applications of computation, and an understanding of meaning requires an understanding of ontology. Even the most basic of all computational objects, the program, raises deep questions upon closer scrutiny. Exactly what is a program? Is it the formal code which the programmer writes in some high level language? Is it the machine language code created after compilation? Is it the algorithm, the organization of computational elements, upon which the formal code is based? Is it the actual sequence of operations which occurs when implemented on an actual computer? If so, how does a program on one computer compare to the same program run on a different computer? What meaning does 'same' have in this context?

Computer science has opened the door, through simulation, to the concept of virtuality. Virtuality has long been implicit in literature, reflected in the need for a suspension of disbelief on the part of the reader of fiction. Computer science has elevated this to a new level through the simulation of real world processes, through the development of virtual reality environments, through the development of autonomous agents, artificial intelligence, and now artificial life. Through the use of suitable effector devices, these virtual agents become capable of bringing about physical change in real world environments. As effective causal agents of change, they no longer appear virtual, but begin increasingly to acquire the characteristics of real agents. Virtuality aside, programs themselves are capable of flying aircraft, guiding surgeons, and building cars. Surely if such software can influence reality, then it itself must be accorded some real status in relation to other elements of reality. But if so, what should that ontological status be?



The Cracks Deepen

Biology

Biology presents a serious challenge to the objectivist view of reality and to the Newtonian approach to understanding. Traditional science depends upon the ability to repeat observations under identical conditions in order to acquire a statistical confidence in the results. It also depends upon the presence of universal features, common to a large collection of subjects, which can be extracted through statistical analysis. Biology, however, deals with individuals, each having a unique development, a unique internal structure, and a historical dependency which renders it difficult, if not impossible, to accurately reproduce observational conditions. Moreover, current thinking in evolutionary biology suggests that a strong role has been played by historical contingency in the shaping of species, and contingencies are not easily dealt with in the standard scientific paradigms. In addition, as the artificial life researchers have pointed out, we have only one example of an evolutionary system to study, and one sample does not permit the kind of statistical inference required traditionally.

Aside from the unavoidable contribution of individuality, there is also a fundamental contextuality within biology. Cohen and Stewart (1994) provide abundant evidence for the contextuality of living systems, whether in the course of evolution, development, or of normal physiological processes and behavior.

The issues described above present primarily methodological and pragmatic obstructions to the pursuit of traditional science. However, biological systems pose deep ontological problems as well. The most fundamental involve the concept of identity and the establishment of an appropriate frame of reference. Living systems, unlike natural objects, arise through a process of development, in which a complex series of morphological and constitutive changes takes place. The fundamental elements and molecules of any living system are constantly being replaced and recycled. Living systems ingest nutrients from outside of their bodies and process these, incorporating some into their bodies while excreting others. They also shed portions of their bodies regularly. As a consequence of these continual biotransformations, a biological system cannot simply be identified with its constituent components. These components remain in a perpetual state of flux, and so the concept of identity for a biological system cannot be supported at this level of description. Instead, one might argue that it is at the level of morphological form that the concept of identity acquires meaning. But this too proves false. Biological systems develop from primitive egg states, comprising single cells, and pass through a complex sequence of morphological changes until the final form is achieved. Even then, most biological systems continue to age, resulting in subtle changes in surface structure and physiological function until ultimately they die. In some species these morphological transformations are so dramatic that, without some prior knowledge of the continuity of the developmental process, one would conclude that one was observing completely different species. Even within a single species, the morphological differences between males and females can be so great as to lead to serious doubts about their relationship to one another. So morphology fails to provide an adequate basis for grounding a concept of identity. At this point one might believe that identity has no meaning for biological systems. Yet this flies in the face of direct personal experience, not only of one's own identity, but also that of one's family, friends, and acquaintances, not to mention one's pets and livestock. The basic identity of a biological system transcends both structure and form, and in so doing establishes a clear break from classical objects, whose identity is inextricably linked to both structure and form.

Note that I am not discussing the thorny problem of the epistemology of identity, that is, the difficulties inherent in the attempt to verify the identity of a given biological system. I am referring instead to the basic ontological issue of the enduring continuity of a given biological system over time.

Psychology

I began this paper with reference to psychology, in the form of Jung, and I return to it again here. The beginning of the Twentieth Century witnessed a rebirth of interest in the study of mind, and the creation of the science of psychology out of its philosophical forebears. Wundt, Fechner, and Helmholtz placed psychophysics on a firm empirical foundation. Freud, Jung, Adler and the early pioneers of psychoanalysis reestablished interest in the dynamics of behavior and in the interior life of individual subjects. Bleuler and Kraepelin gave respectability to the study of psychopathology and the treatment of the mentally ill. Psychology began with phenomenology and vitality, but retreated under the onslaught of objectivist doctrines into the pallid formulation which became behaviorism, rejecting mind for the mechanism of the conditioned response. Psychology languished for decades, until the birth of cognitive psychology in the 1950s. Unfortunately psychology remained unable to shake its attachment to objectivist ideas, and cognitive science failed to live up to its promise of restoring mind to psychology, replacing mind instead with algorithm. Today, psychology continues to avoid dealing with the issue of mind, declaring it to be an empty concept, mere epiphenomenon, and focusing again upon mechanism, this time based in neurophysiology.

The clinical sciences have long continued to embrace the concept of mind, but their evidentiary traditions have remained rooted in the single case study and philosophical argument. As a result there has arisen a plethora of competing paradigms, each claiming to be the one true paradigm, and each claiming to have the proper approach to psychotherapeutics. Clinical experience has shown, however, that no single paradigm has a monopoly upon understanding or efficacy. Instead one finds that different paradigms must be tailored to individual cases, providing excellent agreement with observation and good efficacy in some cases, and not in others. This recognition led to the establishment of the eclectic approach to psychotherapy. No single paradigm provides an adequate understanding of all cases. This fact is seldom acknowledged in the clinical literature. Instead, the lack of proper fit between theory and observation has been labelled as resistance and interpreted negatively within the framework of the original theory. The theory is not corrected, the patient is. This slavish adherence to doctrine over evidence has been a major stumbling block to the proper development of a scientific clinical psychology. The schism which has developed between clinical and academic psychology has left clinical psychology without a firm scientific foundation, and academic psychology without relevance. Yet the very diversity of theories and psychotherapies provides an important clue as to the nature of the objects of psychological inquiry.

Each competing psychological theory constitutes an ontological frame of reference, possessing identified objects of inquiry and a consistent dynamics which governs their interactions. Each such frame describes individual subjects to varying degrees, sometimes with remarkable accuracy and predictability, and sometimes totally at odds with observation. These frames appear to be irreducible with respect to one another. Nevertheless it is important to appreciate that subjects can be described and understood from within these competing frames. It is equally important to appreciate that the dynamics of subjects resonate with the dynamics of these frames to differing degrees. Metaphorically speaking, it is as if these ontological frames of reference provide a collection of dynamical dimensions, and the degree of resonance with the frames places the subject at different points along each dimension, thus situating the subject in an ontological space, as it were. Thus we again find evidence for the existence of multiple, irreducible ontological frames of reference describing a single entity.

Mathematics

Finally let me turn briefly to mathematics. I have stated before that the characteristics of an object render it accessible to a description in set theoretical terms. A set is the prototypical object, being persistent, context independent, observer independent, and possessing intrinsic properties. The application of sets to the description of the various situations described above is deeply problematic. Consider a biological system. The usual approach is to identify a system with its collection of possible states and then to follow the course of the system over time as a trajectory within the set of states. But we have seen that a biological system cannot be identified with its set of states, whether structurally or morphologically defined. These change with each passage of time, and as a result the state space itself changes over time. While not impossible, a description of a system in terms of a continually changing state space is cumbersome, inelegant, and non-parsimonious. It is reminiscent of the problems associated with Ptolemaic epicycles.

Set theory also suffers as an enduring and fundamental foundation for mathematics. The discovery by Gödel of the incompleteness theorem demonstrated that logic alone was incapable of generating all of mathematics. He showed that for any consistent set of axioms which could express arithmetic, there existed at least one true statement which could not be deduced from these axioms using the rules of logic. As if this were not sufficient bad news, other logicians such as Cohen discovered that there was not one theory of sets, but an infinitude of theories of sets. Moreover, he demonstrated that these theories were logically independent of one another, meaning that given any two such theories, there were statements provable in one theory which were not provable in the other. This discovery demonstrated that there is not one mathematics, but infinitely many mathematics. The symbolic method, paradoxically, is too constrained to express all that is true, and too open to express only truths about our reality.

The Twentieth Century has thus left us with a recognition that there are fundamental aspects of reality which are completely non-objective in their characteristics. They are transient, contextually dependent, observer dependent, co-creators rather than possessors of properties, and describable by multiple irreducible ontological frames of reference. The objectivist ontology must be supplanted by a new ontology which deals directly with these new characteristics. Furthermore, the mathematical description of these new aspects of reality demands a representation which stands outside of the usual set theoretic framework. One proposal for such a foundation, based upon the concept of the entity, is presented in the next section, followed by a discussion of a new conceptual basis for a mathematics of entities. It is argued that this new conception embraces the ontological issues raised by the above discussion, and provides a foundation for a non-objectified interpretation of reality.



Ontology Revitalized

Entities, Properties and Frames of Reference

It should be apparent from the discussion above that the concept of the object, and of being objective, is unduly limiting. Although there are indeed abundant situations which can be fairly and effectively described in objective terms, there are many more which cannot. These latter situations include many which we must deal with intimately on a daily basis. Much of biology, psychology, and computer science involves dealing with situations which fall outside of the usual limits of the objective. Having elevated the objectification of reality to the point of dogma, science has progressed much more slowly in these fields than might otherwise have been the case had it had available to it a more appropriate ontological foundation. Scientists may devalue the worth of these fields as domains suitable for scientific inquiry, but they cannot deny their pragmatic importance to our everyday lives. The standards of physics do not apply to these fields, since their ontological structures are distinct and dramatically different. Indeed, as once suggested by Rosen (1987), physics can truly be considered a highly specialized theory with much less applicability than originally thought. The programme for the objectification of reality sought to replace a living, vital reality with one whose essential elements were composed of objects: persistent, observer independent, passive, energetic, and capable of apprehension through a single ontological frame of reference. The examples above point to a reality which is impersistent, irreducible, interactive, nondeterministic, contextual, and many-hued, admitting multiple independent frames of reference. Information rather than energy is the essential currency which shapes reality. Information, and with it, meaning and intention. Our reality of objects must now be expanded to include a reality with entities.

Objects reflect the most syntactic aspects of reality, which in turn accounts for the success of their description through purely syntactic applications of mathematics, logic, and language. But the reality alluded to above does not lend itself to a purely syntactic description. The behavior of entities depends upon context and meaning, and these cannot be captured by syntax alone. A description which incorporates semantic aspects is required, though how to achieve this formally still eludes us. I have chosen to use the term entity to refer to these non-objective aspects of reality precisely because the term carries with it a connotation of vitality, of intentionality, and of change. The central feature which distinguishes entities from objects is the role of information. In the two slit experiment the outcome depends crucially upon the nature of the information being generated, or potentially generated, within the experimental apparatus. In the EPR experiments, the bonds which allow one to treat the correlated particles as a single entity are informational, not material. Computer science deals almost exclusively with the consequences of information flow within a complex system, and the coupling of this information flow through transducers to the external environment provides it with a causal efficacy. Information and meaning are essential to the understanding of mental phenomena, and an entity based ontology is essential if psychology is ever to return to its original goal, the study of mind.

Most processes can be considered as entities, but not all entities are processes. Indeed, many entities do possess some material form and attributes. However, such entities are not defined by these forms and attributes. Their components exist in a state of continual flux, although the functions expressed by these components, their roles as it were, persist. These components behave like actors on a stage, where the play's the thing. Entities are like ontological plays waiting for actors to flesh them out, to make them come alive, to give substance to them. Entities possess coherence, a dynamical and semantic consistency which holds them together both physically and ontologically within a space of meaning. Neither localization nor persistence is a defining feature of entities.

Entities appear to share two fundamental attributes. First of all, their internal coherence gives rise to a definable identity. Entities exist and possess an extensionality across some space of attributes. Their coherence ensures that this extensionality is united into an irreducible whole, a common sense, a common meaning. This is true regardless of whether the entity might, in some other frame of reference, be broken down into lesser units. The patterns of the game of Life possess a meaningful wholeness which transcends their fragmentation into the individual cells of the automaton. A living creature remains a coherent whole in spite of being broken down conceptually into organs or cells. Furthermore, this unitary wholeness, this identity, persists in spite of the myriad changes observed from other frames of reference. Identity is a primitive concept, akin to the concept of membership which is embodied in the concept of the set. Identity too is embodied in the concept of the thread, to be described in the final section. Objects too possess a definable identity, but it is a much simpler construct, being based on the constancy of some properties or attributes. The question of whether we can actually attribute different perceived aspects to a single entity, especially under conditions in which our perception across time is interrupted in some manner, is a deep epistemological question which does not concern me in this paper. One can imagine the enduring identity of an entity regardless of whether or not we are capable of perceiving and denoting such identity. It is that intrinsic enduring identity which does concern me here.
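
The Life example can be made concrete. The following sketch (my own illustration in Python, not part of the original argument) steps a standard glider through four generations: every one of its live cells is replaced, yet the same pattern reappears, merely translated, which is precisely the sense in which its identity transcends its constituent cells.

    from collections import Counter

    def step(cells):
        """One Life generation; `cells` is a set of (x, y) live coordinates."""
        counts = Counter((x + dx, y + dy)
                         for (x, y) in cells
                         for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                         if (dx, dy) != (0, 0))
        # Birth on exactly 3 neighbours, survival on 2 or 3.
        return {c for c, n in counts.items()
                if n == 3 or (n == 2 and c in cells)}

    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    g = glider
    for _ in range(4):
        g = step(g)

    # Different cells, same form: after four steps the glider has simply
    # moved one cell diagonally, so its "identity" outlives its cells.
    assert g == {(x + 1, y + 1) for (x, y) in glider}
    print(sorted(g))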

Second, all entities admit multiple irreducible ontological frames of reference. That is, all entities have the characteristic that they can be described, understood, and modeled from multiple frames of reference, from multiple positions of meaning and interpretation, and from multiple scales, and that some of these frames provide complete descriptions in spite of being mutually irreducible. In other words, they admit multiple interpretations which are not simply transformations or reinterpretations of one another. There is no single privileged frame of reference from which to observe an entity. Instead there are, by necessity, multiple, equally valid, and distinct frames of reference. Entities constitute epistemological towers of Babel.

Each frame of reference, quantum or classical, macroscopic or microscopic, shadow archetype or anima archetype, provides a self contained description of the system. Each frame of reference provides a veneer of meaning through which the entity is to be interpreted and which is sufficient to define itself without recourse to external referencing. The frame of reference identifies the stuff which is to be apprehended, the relevant properties, and the qualities to be attached to those properties. It provides a complete spectrum of these qualities, and a medium through which they may be expressed to and shared by others. Some frames of reference may be unique to a particular observer, or they may be universal in nature. For example, archetypal frames have a local character, while the quantum and classical frames appear to be universal.

These ontological frames of reference should not be identified with their namesakes in mathematics, though an ontological frame of reference may have a symbolic expression in a mathematical frame of reference. An ontological frame of reference provides a parsing of reality into meaningful components, and it is meaning which holds it together and provides it with its power and significance.

It is important to emphasize that although these frames of reference are irreducible, they are not independent. The precise nature of this interrelatedness is an exquisite subtlety, to which justice cannot be done in a short paper such as this. In a word though, the same coherence which provides an entity with an enduring identity also binds these frames of reference together into a unitary whole. It is absolutely essential that these frames of reference cohere. Coherence provides a second primitive concept, and it too has an embodiment in the concept of the tapestry which will also be defined in the final section. In essence by coherence I mean that there will be links established between these ontological frames and that these links will preserve the meanings which are attached to each frame.

To illustrate the concept let me present a metaphor drawn from another field in which coherence is fundamental, that of the study of narratives. In a story, the actions that take place must remain in keeping with the attributes previously given. It would not make sense for example to say in one place "The small thatched cottage had but a single entrance facing east", and then in a later passage to say "The young princess entered the cottage, shivering, as the cold north wind blasted her back". The various descriptions must be linked in such a manner as to maintain a coherent overall pattern of meaning even if they need not be consistent in the strict logical sense. At the very least we cannot have a situation in which it is implied that an entity both possesses and does not possess a particular attribute simultaneously.

Based upon this extended preamble, I can now proceed to define three classes of models for further study. Objects have the characteristics alluded to above, but in addition all of their complete ontological frames of reference are reducible to a single form. All complete ontological frames of reference are (not necessarily reversible) transformations of one another. No new information is provided by a new frame of reference, merely a different vantage point. This essential defining characteristic of an object lies at the heart of its being subject to reductionism, since the process of reduction is just a transformation from one frame of reference into another in which the parsing of reality can be construed as being finer, more detailed, the components smaller and simpler. Objects have a monadic dynamics.

An emergent is characterized by having at least two complete ontological frames of reference such that only one may be transformed into the other. The dynamics which drives an emergent is called protoarchetypal, since it is a precursor form of archetypal dynamics. Two-dimensional cellular automata provide an example of such an emergent. Emergents provide a conceptual and ontological stepping stone between object and entity.

Finally, an entity is defined by the characteristic that there exist at least two ontological frames of reference such that neither can be transformed into the other. The dynamics which governs entities is termed archetypal dynamics. The term archetypal is chosen because prototypical examples of entities and their dynamics include living organisms, psychological systems, especially psychodynamical systems, economies, and societies.
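
As a purely illustrative sketch (the frame names and the transformability relation below are hypothetical, not drawn from the text), the three definitions can be phrased as a simple classification over pairs of complete frames of reference.

    def classify(frames, transforms):
        """Classify a system from its frames of reference.
        transforms: set of (a, b) pairs meaning "frame a can be transformed into b".
          object   -- every pair of frames is inter-transformable,
          entity   -- some pair is transformable in neither direction,
          emergent -- otherwise (some pair transformable in only one direction)."""
        frames = list(frames)
        one_way = neither = False
        for i, a in enumerate(frames):
            for b in frames[i + 1:]:
                ab, ba = (a, b) in transforms, (b, a) in transforms
                if not ab and not ba:
                    neither = True
                elif ab != ba:
                    one_way = True
        if neither:
            return "entity"
        return "emergent" if one_way else "object"

    # Mutually transformable frames: an object.
    print(classify(["macroscopic", "microscopic"],
                   {("macroscopic", "microscopic"), ("microscopic", "macroscopic")}))
    # A cellular automaton seen at cell level vs. pattern level, one way only: an emergent.
    print(classify(["cells", "patterns"], {("cells", "patterns")}))
    # Two mutually irreducible frames: an entity.
    print(classify(["quantum", "classical"], set()))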

Entities are not determined by their properties and attributes, nor can they easily be described as the possessors or generators of these properties and attributes. These are not intrinsic, nor are they independent of the process of observation and measurement. Instead they are interactional in nature. The frames of reference, which must in some sense describe these properties and attributes, must therefore include some reference to the nature of the constructor which generates the frame. Frames generated by interactions with objects have a different character than those constructed by interactions with other entities. The measurement problem in quantum mechanics is an example of the construction of a frame for an entity using an object as the constructor. The classical frame is firmly rooted in the ontology of objects. The quantum frame, however, is a construct, inferred to exist but not yet explicated, since it would require the use of quantum entities to provide it with the necessary self referential character to ensure its internal coherence and consistency. Such an undertaking has not yet been carried out. Ontological frames of reference provide descriptors for objects, emergents, and entities, but in and of themselves they do not provide any information about the dynamics of these things, about the manner in which the properties and qualities so revealed interrelate. Although many objects, emergents, and entities, especially those which are conceived formally, will have a deterministic dynamical structure, the vast majority will be stochastic or nondeterministic in nature. It is usual to consider the dynamics to be a correlational structure which is imposed upon some fixed frame of reference. But the dynamics itself possesses meaning and form. Indeed, though it has gone somewhat out of fashion with the advent of the energy representation, the concept of force once provided a useful frame of reference from which to interpret causal relationships in physics. But one must understand that the very notion of force requires a frame of reference for its interpretation, and that this notion admits of multiple referencing just as do the entities which it influences. As a result, forces are themselves entities. This implies, or at least suggests, that causal agencies are themselves entities and must be treated as such, with multiple frames of reference. This suggests that protoarchetypal and archetypal dynamics may have an expression in the form of entities.

If dynamics are entities, then how are the dynamics and the entities which they govern related? This is a particularly deep question to which I can give only a partial answer at present. If we look again at the examples above, we observe that each individual frame of reference comes with its own particular dynamic, and that these dynamics interrelate just as much as the frames of reference do. If, as I have suggested, the dynamic as a whole is an entity, then it in turn will be described by multiple ontological frames of reference. In particular, it should be possible to establish a linkage between the various frames describing the entity and those which describe the dynamics. This linkage then becomes the expression of the dynamic. This linkage need not, indeed should not, be construed as a set valued mapping from one formal frame of reference to another. The character of this linkage will be explored in the next section, in which a mathematical representation of this situation will be attempted. At the very least, however, it should preserve the coherences inherent in the dynamic and the entity. The entity is not driven by the dynamic, but rather couples to it. In some cases the coupling is very strong, perhaps complete, and the entity then possesses a unitary dynamic. One can speak of the entity resonating with the dynamic. Such is generally the case for objects. In other cases the coupling may be quite weak, and we can speak only of the entity being influenced by the dynamic. Archetypes influence the psyche in just such a manner, with individuals coupling to different archetypes to differing degrees at different times throughout the course of their lives. It is this which gives rise to such doctrinal confusion and rancor in the clinical sciences. Entities will frequently couple to many different archetypal dynamics, and these couplings will change over time. Understanding resonances becomes an important task in analyzing the entity-dynamic relationship. One can understand the so-called collapse of the wave function as simply being a situation in which the dynamics of a quantum entity becomes coupled to, or resonates with, a classical entity or dynamic.

The character of the dynamics depends upon the coupling of the entity to the driving entity: a tight coupling gives rise to a deterministic dynamics, a weak coupling to a stochastic dynamics. Most natural entities are stochastic.
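
The following toy simulation (entirely my own; the logistic map and the noise model are arbitrary choices, not taken from the text) illustrates how a single coupling parameter might interpolate between the deterministic and the stochastic regimes.

    import random

    def evolve(x0, f, coupling, steps, seed=0):
        """With probability `coupling` apply the driving map f (resonance);
        otherwise drift randomly, independently of the dynamic."""
        rng = random.Random(seed)
        x, trajectory = x0, [x0]
        for _ in range(steps):
            if rng.random() < coupling:
                x = f(x)                      # resonate with the driving dynamic
            else:
                x += rng.uniform(-0.1, 0.1)   # uncoupled drift
            trajectory.append(x)
        return trajectory

    logistic = lambda x: 3.7 * x * (1 - x)    # an arbitrary driving dynamic

    print(evolve(0.2, logistic, coupling=1.0, steps=10))  # effectively deterministic
    print(evolve(0.2, logistic, coupling=0.2, steps=10))  # largely stochastic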

Threads, Tapestries, and Links

In this final section, let me attempt to outline a formal theory with which to capture the notions discussed above. Just as the concept of set provides a primitive concept for the description of objects by realizing the concept of membership, so the concept of the thread provides a primitive concept for the description of entities by realizing the concept of identity. The term thread was chosen for two reasons. First, that material object which we call a thread possesses a physical extension which is continuous in one dimension. This extension gives rise to the concept of its length, and it provides an enduring visual image of a continuous structure which, if it were laid out in a 3-space coordinate system, would provide a representation of a trajectory of some system, that is, a representation of the continuous extension of some system over time. Second, the use of the term pays homage to the science fiction writer Orson Scott Card, who first used the term, in a causal sense similar to mine, in his Ender Wiggin novels (Card, 1996). There, Card explains the origins of the vitality and soulfulness of living systems as a consequence of the existence of more primitive agencies called philotic threads, which manifest themselves in our normal space-time as living creatures. The synchronicity which is such a vital aspect of our psychological experience arises out of the twining of these philotic threads. His work provides a nice image for motivating what is meant here by a thread. In essence, the concept of a thread captures the fundamental, enduring continuity and coherence of an entity across time and space.

A physical thread can be thought of as binding together different regions of space into a single object, by virtue of its continuity in space. By analogy, our threads bind together different states, behaviors, configurations, forms, or properties of an entity into a single whole, again by virtue of continuity. Here, though, continuity is to be understood at the level of identity, reflecting that all of these aspects belong to a single ontologically coherent and persistent unit, a single whole. In order to understand the concept of thread, one must think of an identity as having an ontological status of its own, independent of those particular attributes and properties which any specific manifestation might possess. Entities have been seen to possess enduring identities which are independent of their particular properties; these properties help us to distinguish distinct identities epistemologically, but in and of themselves they do not serve to define each individual identity. It requires an awareness of the totality of each entity, both spatially and temporally, in order to assert that any given entity possesses a specific, identifiable identity. Nevertheless it still possesses an identity regardless of our ability to distinguish and label it as such. It is that deep, intrinsic aspect of identity which is the fundamental characteristic of a thread.

In the modeling of objects, one can use the independence of properties so as to provide them with an independent ontological status. In so doing one can represent an object as moving about within some preexisting space of possibilities. Identity becomes equated with trajectory in this case. As we have seen, however, entities are co-creators of properties, and there is no predetermined space of possibilities, since what arises depends upon local contingencies. One would need a space of all spaces in order to adequately describe this situation, and such an approach becomes not only unwieldy, but subject to the paradoxes which eventually forced set theory to extend into the theory of classes. But in the case of entities our problem would not be solved, since one would now require a class of all classes in order to preserve the classical approach to identity. Clearly one must either abandon the notion entirely, or adopt a different approach. Instead of attempting to encompass all aspects of an entity into a single frame, let us accept that entities create their spaces of possibilities as they go, and that the only aspects which are certain for an entity are those which have already been made manifest, and those which in the present are becoming manifest. An entity is a fundamental expression of the arrow of time which is so belittled by physicists, but which cannot be avoided here, since entities possess such a deeply entrenched historical dependency. An entity extends itself from past into future by means of the present, just as it may extend itself from occupied space into unoccupied space across its boundary. Implicit in this conception are two fundamental operations: the extension of the entity from present into future, and the examination of the previously manifest past. Epistemologically speaking, of course, we have no actual record of this manifest past, apart from partial recordings of selected aspects and the indirect effect that the past has upon memory and behavior through learning. Nevertheless, at the very least one can argue that the past was made manifest at one time, and that, in principle, a detailed recording could have been made which could later be examined. It is this potentiality which is captured in the notion of examination.

Thus we come to operationalize the concept of thread by introducing two operations: the operation of extension, E, which extends a thread i by a fragment of thread i' while retaining the identity of the original thread i, a process we denote as i -> E(i,i'); and a second operation e, examination, which assigns to each thread i either a thread fragment or a null thread, n, a process we denote as e(i) = i'. To each extension operator E there corresponds an examination operator e such that if i -> E(i,i') then e(i) = i', and otherwise e(i) = n. This is a minimal consistency criterion which preserves causality.
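
A minimal computational sketch of these two operations, assuming a deliberately naive representation of threads and fragments (the class and names below are my own, not part of the formal theory), might read as follows.

    NULL = None   # the null thread n

    class Thread:
        def __init__(self, label):
            self.label = label        # a purely descriptive label
            self.fragments = []       # the manifest past, in temporal order

    def E(thread, fragment):
        """Extension: extend `thread` by `fragment`; identity is preserved."""
        thread.fragments.append(fragment)
        return thread                 # the same thread, not a new one

    def e(thread):
        """Examination: the fragment contributed by the last extension, else n."""
        return thread.fragments[-1] if thread.fragments else NULL

    # Consistency criterion: if i -> E(i, i') then e(i) = i'.
    i = Thread("some entity")
    assert e(i) is NULL
    E(i, "state at t0")
    E(i, "state at t1")
    assert e(i) == "state at t1"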

In the material case, threads can be woven together to create fabrics and intricately patterned items called tapestries. Entities, too, generally consist of great numbers of lesser entities (or even objects) which come together in myriad complex, and often transient, ways. As an example, think again of the ebb and flow of substrate and substance in a living organism. These lesser entities transiently couple their dynamics to one another. Thus it is reasonable to conjecture that our threads should also couple in some fashion to one another.

There exists another operation which acts not upon single threads but rather upon collections of threads. This operation, W, termed weaving, creates complex patterns of threads called tapestries. The weaving operation is also a partial extension operator for tapestries. I say partial because it can take as its domain of operation any suitable subcollection of threads. Given threads i1, i2, i3, ..., we may write the weave as W(i1,i2,i3,...).

To each weaving operation one again has an examination operator, w. To each examination operator w, there corresponds a thread examination operator e. We relate the thread and tapestry examination operators as follows: w(W(i1,i2,i3,...)) = W(e(i1),e(i2),e(i3),...). This means that examining a weave is the same as weaving the examined fragments. This again constitutes a consistency criterion.
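
Continuing the same naive representation (again purely illustrative, with fragments represented as list entries and a tapestry as a tuple of threads), the weaving and examination operators and their consistency criterion can be sketched as follows.

    def e(thread):
        """Thread examination: the most recent fragment, or None as the null thread n."""
        return thread[-1] if thread else None

    def W(*threads):
        """Weaving: here a tapestry is simply the tuple of its woven threads."""
        return tuple(threads)

    def w(tapestry):
        """Tapestry examination: examine each woven strand."""
        return tuple(e(t) for t in tapestry)

    i1, i2, i3 = ["a0", "a1"], ["b0"], []

    # Consistency criterion: examining the weave equals weaving the examined fragments.
    assert w(W(i1, i2, i3)) == W(e(i1), e(i2), e(i3))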

A thread corresponds to an identifiable ontological construct, a tapestry to a collection of interacting constructs. A tapestry, together with its associated threads and operators, constitutes an ontological frame of reference. An entity is describable using multiple frames of reference, and thus there are multiple tapestries which describe an entity. I previously stated that these multiple frames are mutually irreducible, but are not entirely independent. These internal dependencies give rise to what I call links between tapestries. Links can be thought of as nonextensional threads which form weaves between distinct tapestries, reflecting and preserving deep causal interconnections between the two tapestries and providing a reference through which the concepts of coherence and consistency may be operationalized. Given two tapestries T1 and T2, two examination operators w1 and w2 with corresponding thread examination operators e1 and e2, threads i1, i2, ... in T1 and threads i'1, i'2, ... in T2, a link L(T1,T2) is a special weaving L of the form L(w1(T1),w2(T2)) = L(e1(i1),e1(i2),...; e2(i'1),e2(i'2),...). That is, the link weaves together, or relates, thread fragments at one extension with those at another. This provides a causal linkage between the two tapestries.
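
A link might likewise be sketched, under the same illustrative assumptions, as a weave which relates the examined fragments of one tapestry to those of another; the frame names and the coherence relation below are hypothetical.

    def e(thread):
        """Thread examination: the most recent fragment, or None."""
        return thread[-1] if thread else None

    def w(tapestry):
        """Tapestry examination: the tuple of currently examined fragments."""
        return tuple(e(t) for t in tapestry)

    def link(t1, t2, relate):
        """A link: a weave relating examined fragments of one tapestry to those of another."""
        return [(f1, f2) for f1 in w(t1) for f2 in w(t2) if relate(f1, f2)]

    # Two hypothetical frames of reference describing the same entity.
    anatomical = (["resting", "dividing"], ["migrating"])
    behavioural = (["feeding"],)

    # A toy coherence relation: linked fragments must both be manifest (non-null).
    print(link(anatomical, behavioural, lambda a, b: a is not None and b is not None))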

The present work is preliminary, and is meant to suggest that the ontological analysis given above is not vacuous, but offers a point of departure for the establishment of a new mathematics better suited to the study of those complex systems which regularly appear in psychology, psychiatry, and the life and social sciences.

References

  1. Leeming, D.A., Leeming, M.A. (1994) Encyclopedia of Creation Myths. Santa Barbara: ABC-CLIO.
  2. Card, Orson Scott. (1996) Children of the Mind. New York: TOR Books.
  3. Cohen, J., Stewart, I. (1994) The Collapse of Chaos. London: Penguin.
  4. Crutchfield, J.P., Hanson, J.E. (1993) Turbulent Pattern Bases for Cellular Automata. Santa Fe Institute Preprint 93-03-010.
  5. Einstein, A., Podolsky, B., Rosen, N. (1935) Phys. Rev. 47, 777.
  6. Fox, R.F., Elston, T.C. (1993) Amplification of fluctuations by the Lorentz equations. Chaos. 3(3), 313-323.
  7. Goldstein, S. (1998a) Quantum Theory without Observers - Part 1. Physics Today 51 (3) 42-47.
  8. Goldstein, S. (1998b) Quantum Theory without Observers - Part 2. Physics Today 51 (4) 38-42.
  9. Grimal, P. (1965) Larousse World Mythology. New York: Prometheus Press.
  10. Haroche, S. (1998) Entanglement, Decoherence, and the Classical/Quantum Boundary. Physics Today 51 (7) 36-42.
  11. Jung, C.G. (1935) Archetypes and the Collective Unconscious. Collected Works, Vol. 9. p. 54. Princeton : Bollingen.
  12. Jung, C.G. (1955) Mysterium Coniunctionis. Collected Works, Vol. 14. Princeton: Bollingen.
  13. Pearson, C.S. (1991) Awakening the Heroes Within. New York: Harper Collins.
  14. Rosen, R. (1987) Some epistemological issues in physics and biology. In B.J.Hiley and F.D.Peat, Quantum Implications: Essays in Honor of David Bohm. London: Routledge.
  15. Smith, B.C. (1998) The Origin of Objects. Cambridge: The MIT Press.
  16. Sulis, W. (1996) Towards a Formal Theory of Collective Intelligence. Poster presented at Artificial Life V, Kyoto, Japan.
  17. Sulis, W. (1997) Collective Intelligence as a Model for the Collective Unconscious. Psychological Perspectives. 35, 64-91.
  18. Sulis, W. (1994) Naturally Occurring Computational Systems. World Futures, 39, 225-241.



Sulis, W; (1998). Archetypal Dynamics. Presented at INABIS '98 - 5th Internet World Congress on Biomedical Sciences at McMaster University, Canada, Dec 7-16th. Invited Symposium. Available at URL http://www.mcmaster.ca/inabis98/sulis/sulis0731/index.html
© 1998 Author(s) Hold Copyright