Consciousness Reassessed

Published in Mind and Matter 2004.

I gratefully acknowledge the most helpful critical evaluations of earlier drafts by Professors Patrick Heelan SJ and Harald Atmanspacher, as well as an unnamed reader. Dr. Karen Shanor used her expert editing skills to eliminate ambiguities and to make the manuscript understandable.

There used to be a guide to the famous maze at Hampton Court that showed the quickest route to take. Nobody who used it ever reached the center, which lies not in the unraveled, but in the unraveling. 

John Fowles, Islands, 1978, p. 319

Introduction

Many sophisticated essays and books have recently been written about the topic of ‘consciousness.’ My own contributions date back some 25 years to an essay entitled ‘Problems concerning the structure of consciousness’ (Pribram 1976) and, five years before that, to delineating the difference between brain processes that are coordinate with awareness and those that are coordinate with habitual behavior (Pribram 1971). I have been intrigued by what has been written since and take this occasion to reassess a few of the major issues that have arisen.

The reassessment focuses on the ‘how’ of mind/brain transactions. These are currently subsumed under the headings of 1st person and 3rd person viewpoints. Definitions of consciousness such as those used in medicine (e.g. coma, stupor, sleep and wakefulness) obviously come from a 3rd person vantage. The various studies on ‘theories of mind,’ which infer how we come to understand one another, also take a primarily 3rd person stance. Other 3rd person approaches include descriptions of attention, intention and thought that are discussed in some detail in this essay. The current attempt is to approach the issue of conscious experience from a 1st person perspective as fleshed out by the ‘how’ of 3rd person research.

The Primacy of Conscious Experience

For each of us all inquiry, and therefore knowing, begins with our own experience. This experience becomes ‘conscious’ when it becomes accessible to being monitored. Monitoring can be proactive or retrospective. The experience develops as interpretive transactions occur between our genetic heritage and its biological, social, and cultural context. Acknowledging this primacy is relevant to many of the issues that are currently so hotly debated. For example, by approaching the mind/brain relation in a top-down fashion, brain processes are not seen as homunculi, little people inside the head. Instead, brain processes become understood as some, but not all, of the organizing influences that compose our conscious experience. Understanding develops as we explore these experiences. Understanding at any moment is hermeneutic and therefore partial, as is most scientific understanding.

My version of a top-down approach to understanding conscious experience is hostile to an eliminativist, totally reductionist, materialist stance (Churchland, P.M. 1995; Churchland, P.S. 1986; Crick 1994), whether that stance is epistemic or ontic. In contrast, my view respects each scale of inquiry on its own level. There is, however, tolerance for a weak form of reduction: at the boundaries between scales, identities are sought in the form of transformation rules, translations between languages. Understanding at each scale is experienced as a process in its own right, without losing sight of the interrelations and transformations that encompass the whole (Pribram 1971; 1998; King and Pribram 1995; Pribram and Bradley 1998).

A cathedral may be made of bricks, but understanding the cathedralness of the structure is not limited to understanding bricks (nor even understanding the importance of buttresses that allow Gothic soaring). We may smash matter into particles but understanding the material universe is not limited to understanding particles. We note drops of water as they drip from a faucet and can make quantitative measures on drops. But where are the drops once they are gathered in a bucket? Do our measurements on drops per se account for the crystalline structures of snowflakes?

Having proclaimed this caveat, I want to add: but isn’t it fascinating and exciting to know of all these bits and pieces, how they’re put together, and to what purpose! That is the story of exploration of our conscious experience.

For example: I experience the color red. I note that people stop at red traffic lights. Perhaps they have had similar experiences to mine. In another context and at a different time I learned that, in terms of physics, the color red is produced by a specific bandwidth of a spectrum of electromagnetic radiations. But I also found out that this experience is specific to a particular context of illumination. For instance, I see the same colored objects as having a different color under ultraviolet illumination than under ordinary light.

Further along in my career, I studied color vision in preparation for my medical school thesis, and was surprised to find that the central part of the retina, the fovea that we ordinarily use in daytime to see patterns, had very few (somewhere around 7%) receptors for the wavelengths of the spectrum that we identify as blue. Our color vision seemed to be composed either of only two types of ‘color’ receptors (Wilmer, 1946) or of one ‘color’ receptor plus white (Land, 1986). The primary sensitivities are combined at further processing levels into opponent types of sensitivities (e.g. blue versus yellow and red versus green). But opponent processing had to be rooted in ‘something blue’ and the root was feeble.

Not until recently has a sophisticated brain model of color vision been composed. Russell and Karen DeValois based the model on all the earlier work and on their own and others’ more recent psychophysical and neurophysiological data (DeValois and DeValois 2003). They used a low-frequency (red) and a medium-frequency (yellow) type of receptor as the ‘primaries’ and brought in the higher-frequency (blue) receptors as modulators. The model also accounts, within the same neuronal ‘network,’ for the reception of black/white necessary for the perception of shape. For me, the successful outcome of these decades of detailed experimentation and theoretical attempts at solving a mostly ignored physiological observation has been a heart-warming experience.

In an entirely different realm of inquiry, I heard that cultures differ in the number and kind of colors that the people in those cultures can share with others. This is a good example of how my experiences become meshed with those of others. In the 1960s, nomads in northern Somalia were unable to distinguish red from green. Nor could they distinguish red from yellow or black in ordinary circumstances. In their semi-desert environment red was rarely if ever experienced. But they distinguished many shades of green and had names for these shades. Peace Corps volunteers were unable to differentiate between these many shades (Karen Shanor, personal communication).

Interestingly, some Somalis could distinguish colors such as red, orange and purple: they were tailors and merchants who dealt with colored fabrics. In short, they had been trained to perceive. The question arises as to whether these people experience the variety of colors prior to training. I have a personal story that sheds some light on this issue: When two colleagues and I began to study the anatomical composition of the thalamus of the brain, all we could make out was an undifferentiated set of stained cells. One of us complained that the thalamus looked like a bunch of polka dots. After months of peering down a microscope and comparing what we were looking at with atlases and published papers, the differentiation of various nuclear structures within the thalamus became obvious to us. We had reached ‘inter-subjectivity’ which, in other contexts, has been referred to under the heading ‘theory of mind.’ Continued study and experimentation over several years enabled us to publish substantive contributions on the organization of the thalamus and, additionally, on the role this organization plays in its connections to the brain cortex. On the basis of these findings I was able to distinguish a difference in organization between the posterior convexity of the brain and that of its frontolimbic formations, excursions into what Moghaddam (2003) calls inter-objectivity.

In a ground-breaking set of experiments, James and Eleanor (Jackie) Gibson (1955) showed that perceptual learning consisted of progressive differentiation, not enrichment through association. My conclusion is that the cultures that do not communicate a rich diversity of colors have the capacity to do so but do not actually experience that richness until they learn to do so.

To summarize: Science and personal observation have discovered a great deal about conscious experience of color: some physics, some biology, some brain science, some social and cultural facts. I believe that by taking into account these observations and experimental results as well as showing their limitations and contextual constraints, we can say that we have some ‘understanding’ of (standing under) our conscious experience of the color red as one facet of the world within which we live and act. Contrary to the overambitious pronouncements of some (or is it many) current philosophers of science, scientists gratefully search for such partial understandings: Conscious experience itself is the starting point, not the end of knowing and understanding.

The Privacy of Conscious Experience?

Many philosophers of science currently contend that one of the most intractable problems in studying consciousness is that my consciousness (now often referred to as 1st person consciousness) is for all practical purposes inaccessible to others (now often referred to as 3rd person consciousness). This issue has been called one of the hard problems dogging scientific understanding of conscious experience. But if one’s conscious experience is the starting point not the end of inquiry, we come to realize that in fact, we are very good at communicating our personal, 1st person, conscious experience to others and to ourselves. The communication can be verbal or non-verbal.

Take the reflex of withdrawing one’s hand from a hot flame or pot (René Descartes’ example). It is possible, by repeating the behavior, to condition the response so that withdrawal takes place before the pot is touched (a fractional anticipatory reaction in stimulus-response psychological theory). But by becoming aware of the withdrawal and the hotness of the pot, not only can we abbreviate the process of not touching hot pots but, in addition, we can transmit what we know to our children and roommate (who might be an absent-minded professor). Consciousness is what it says: ‘con-sciousness,’ to know together.

My claim is that the problem of communicating to ourselves and to others what constitutes my personal experience is not that different in kind from that of experiencing observations in the physical sciences. Watching an oscilloscope screen that supposedly tells me what is going on within an atom, or viewing through a telescope the images that appear to relate to the happenings in the stellar universe, are fraught with uncertainties and subject to the contexts (e.g. the devised instruments) by which the observations are made. Ernst Mach (1890), whose father was an astronomer, based his whole career on trying to distinguish the ‘subjective’ aspects of physical observations. His legacy (Mach bands: see Ratliff 1965) attests to the importance of the ‘psych’ in psychophysics: a grey spinning wheel that is continuously varying in shade from center to periphery looks to us as if the shading were discontinuous and banded. Even more dramatically, David Bohm (1973) pointed out that should we observe the universe without lenses it would appear as a hologram. As a neuroscientist, I noted that the same consideration must apply to the role of the lenses of the eye and the lens-like structures of other sensory receptors (Pribram 2004). Without these receptor organs we would experience the world we live in as a hologram.

We observe and communicate with others, we develop tools for more acute observation, and we formulate the results to receive consensual validation that we are on the right track. What is really private are the unconscious processes to which we have such limited access. Sigmund Freud’s contribution (see, for instance, Pribram and Gill 1976; Pribram 2003; 2004) was to attempt a technique by which we could access these unconscious processes and bring them into our conscious experience so that we could share them and do something about them.

Thus, through consciousness we become related to each other and to the biological and physical universe. Just as gravity relates material bodies, so consciousness relates sentient bodies. One can no more hope to find consciousness by digging into the brain than one can find gravity by digging into the earth. One can, however, find out how the brain helps organize our relatedness through consciousness, just as one can dig into the earth to find out how its composition influences the relatedness among physical objects by gravitational attraction.

MATTER AND MIND

Observables, Observations, and Measurement

Just what is the specific role of the brain in helping to organize our conscious relatedness? A historical approach helps sort out the issues. The Matter/Mind relationship has been formulated in terms of cuts. In the 17th century the initial cut was made by René Descartes (1972/1662), who argued for a basic difference in kind between the material substance composing the body and its brain and conscious processes such as thinking. With the advent of quantum physics in the 20th century Descartes’ cut became untenable. Werner Heisenberg (1927) noted a limit to simultaneously measuring moment, the rotational momentum of a mass, and its location. Dennis Gabor (1946) found a similar limit to our measurement of a communication, that is, minding, because of a limitation in simultaneously measuring the spectral composition of the communication and its duration.
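
In conventional notation (mine, not the paper's), the two limits can be written as companion uncertainty relations, Heisenberg's for location and momentum and Gabor's for duration and spectral composition:

    \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}, \qquad \Delta t \, \Delta f \;\ge\; \frac{1}{4\pi}

Both follow from the same Fourier relationship between a variable and its conjugate, which is why Gabor could treat communication with mathematics of the same form as that used for matter.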

These indeterminacies place limits on our measurement of both matter and mind and thus on the location of the matter/mind cut. Heisenberg (1927) and also Wigner (1967) argued that the cut should come between our conscious observations and the elusive ‘matter’ we are trying to observe. Niels Bohr (1959) argued more practically that the cut should come between the instruments of observation and the data that result from their use. (For sophisticated reviews of how these scientists viewed the results of their observations and measurements see Heelan 1965; Stapp 1997.)

In keeping with Bohr’s view, these differences in interpretation come about as a consequence of differences in focus provided by instrumentation (telescopes, microscopes, atom smashers, and chemical analyzers). Measurements made with these instruments render a synopsis of aspects of our experience as we observe the world we live in.

The diagram below provides one summary of what these measurements indicate both at the quantum and cosmic scale. The diagram is based on a presentation made by Jeff Chew at a conference sponsored by a Buddhist enclave in the San Francisco Bay area. I had known about the Fourier transformation in terms of its role in holography. But I had never appreciated the Fourier-based fundamental conceptualizations portrayed below. I asked Chew where I might find more about this and he noted that he’d got it from his colleague Henry Stapp who in turn had obtained it from Dirac. (Eloise Carlton, a mathematician working with me, and I had had monthly meetings with Chew and Stapp for almost a decade and I am indebted to them and to David Bohm and Basil Hiley for guiding me through the labyrinth of quantum thinking.)

One way of interpreting the diagram is that it indicates matter to be an ‘ex-formation,’ an externalized (extruded, palpable, compacted) form of flux. By contrast, thinking and its communication (minding) are the consequence of an ‘internalized’ (neg-entropic) forming of flux, its in-formation. Flux (or holoflux; Hiley 1996) is here defined (see Pribram and Bradley 1998) as representing change, measured as energy (the amount of actual or potential work involved in altering structural patterns), and inertia, measured as moment, the rotational momentum of mass. David Bohm (1973) had a similar concept in mind, which he called a holomovement. He felt that my use of the term ‘flux’ had connotations for him that he did not want to buy into. I, on the other hand, felt holomovement (ordinarily a spacetime concept) to be vague and wrong since there is nothing being moved. We are dealing with spectra, distributions of energy and moments measured in terms of frequency (or spectral density). In the nervous system such distributions have been recorded as representing activity of dendritic receptive fields of neurons in sensory cortexes (Pribram 1991; King, Xie, Zheng, and Pribram 2000; Pribram, Xie, Zheng, SantaMaria, Hovis, Shan, and King, accepted for publication).

The diagram has two axes, a top-down and a right-left. The top-down axis distinguishes change from inertia. Change is defined in terms of energy and entropy. Energy is measured as the amount of actual or potential work necessary to change a structured system and entropy is a measure of how efficiently that change is brought about. Inertia is defined as moment, the rotational momentum of mass. Location is indicated by its spatial coordinates.

The right-left axis distinguishes between measurements made in the spectral domain and those made in spacetime. Spectra are composed of interference patterns where fluctuations intersect to reinforce or cancel. Holograms are examples of the spectral domain. I have called this pre-spacetime domain a potential reality because we navigate the actually experienced reality in spacetime.
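
As a point of reference (my notation, not part of the original), the Fourier transform is the standard mathematical bridge between the two domains: a pattern described over spacetime and the same pattern described as a spectrum carry identical information,

    F(\nu) \;=\; \int_{-\infty}^{\infty} f(t)\, e^{-2\pi i \nu t}\, dt, \qquad f(t) \;=\; \int_{-\infty}^{\infty} F(\nu)\, e^{\,2\pi i \nu t}\, d\nu .

Holography rests on recording the spectral (interference) description; the inverse transform recovers the spacetime image.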

The up-down axis relates mind to matter by way of sampling theory (Barrett 1993). Choices need to be made as to what aspect of matter we are to ‘attend.’ The brain systems that are coordinate with sampling have been delineated, and brain systems that impose contextual constraints have been identified (Pribram 1959; 1971).

My claim is that the basis function from which both matter and mind are ‘formed’ is flux (measured as spectral density); it provides the ontological roots from which conscious experiences regarding matter (physical processes) as well as mind (psychological processes) become actualized in spacetime. To illuminate this claim, let me begin with a story I experienced: Once, Eugene Wigner remarked that in quantum physics we no longer have observables (invariants) but only observations. Tongue in cheek, I asked whether that meant that quantum physics is really psychology, expecting a gruff reply to my sassiness. Instead, Wigner beamed a happy smile of understanding and replied, ‘Yes, yes, that’s exactly correct.’ If indeed one wants to take the reductive path, one ends up with psychology, not particles. In fact, it is a psychological process, mathematics, that describes the relationships that organize matter. In a non-trivial sense current physics is rooted in both matter and mind. (See Chapline 1999: ‘Is theoretical physics the same thing as mathematics?’)

Conversely, communication ordinarily occurs by way of a material medium. Bertrand Russell (1948) addressed the issue that the form of the medium is largely irrelevant to the form of the communication. In terms of today’s functionalism it is the communicated pattern that is of concern, not whether it is conveyed by a cell phone, a computer or a brain and human body. The medium is not the message. But not to be ignored is the fact that communication depends on being embodied, instantiated in some sort of material medium.

This convergence of matter on mind, and of mind on matter, gives credence to their common ontological root. (Pribram 1986; 1998). My claim is that this root, though constrained by measures in spacetime, needs a more fundamental order, a pre-spacetime potential that underlies and transcends spacetime. The spectral bases of the quantal nature of matter and of communication portray this claim.

Identity and Multiple Instantiations; Neither Reduction nor Multiple Aspects

Many of the problems that fuel the current discourse on consciousness are due to the acceptance of a radical reductionist stance. Take Francis Crick’s view that if we knew what every neuron is doing we would dispense with Folk Psychology. But what every neuron is doing is a complex process composed of synapto-dendritic fine fibered transactions, circuits, modules composed of circuits and systems composed of modules. The complexity of our experience can also be hierarchically organized into levels of organization, scales of processing that must be taken into account if we are to relate the organization of our experience to the organization of the brain (see, for instance, King and Pribram, eds., Scale in Conscious Experience, 1995).

With regard to the complexity of neural organization, it is important to return to Russell’s (1948) caveat: there is a great deal that can be learnt about brain processes that is irrelevant to the transduction and modification of the informational patterns that form our conscious awareness. Of course it is important to know how brain matter is constituted in order to prevent and to heal breakdown. But much of this knowledge does not contribute to the critical relationships that describe how brain processes contribute to the organization of mind. This is essentially the argument of the functionalists.

Memory storage is a case in point. Material scientists are needed to develop the best substrates for making CDs and DVDs. In earlier times the development of tape recordings went through several phases of finding a suitable recording material. Initially, recordings were stored on wire—I remember well the irritating tangle such wires got themselves into: a hardware Alzheimer’s disease. Embedding the wires into plastic solved that problem. What is it in normal brains that prevents such tangles from occurring? Whatever it is, it is most likely not directly relevant to the code that is instantiated in the neural tissue, wire, tape or CD.

To repeat: The medium is NOT the message. The message becomes embedded in one or another limited but essential characteristic of the medium. It is up to brain scientists interested in behavior and in conscious experience to discern just what this characteristic is—to sort the wheat from the chaff.

With regard to the hierarchical complexity of experience, insights can be gained by taking computers and computer programs as metaphors. An identity in structure characterizes both the binary machine language (BITS) and the basic hardware operations of the computer. Octal and hexadecimal coding is a condensed encoding scheme (into BYTES) by triplets and quadruplets. What is seldom recognized is that the size of a BYTE determines a minimal form of parallel processing (as this term is understood in the construction of massively parallel distributed processing (PDP) computational architectures). The change in the syntactic scheme from BITS to BYTES allows a change from a code where meaning resides in the arrangement of simple elements to a pattern where meaning resides in the structure of redundancy, that is, in the complexity of the pattern. In this scheme each non-redundant element conveys a unique meaning. (For details as to how such coding schemes become implemented in computer hardware and in the nervous system, see Pribram 1971, Languages of the Brain, Lecture 4, Codes and Their Transformations, pp. 66–74.) A further set of hierarchically ordered programming languages leads to the capability for the input and output devices to address the computer in a natural language such as English. At the lower levels of the hierarchy it is often useful to implement the software in hardware and vice versa. But for higher order programs this is infeasible.
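
As a minimal illustration of this condensation (my example; the bit string and variable names are hypothetical, not taken from the text), the same binary pattern can be re-grouped into hexadecimal quadruplets or octal triplets in a few lines of Python:

bits = "0110100001101001"  # 16 BITS, i.e. two 8-bit BYTES

# group into quadruplets -> hexadecimal digits
hex_digits = [format(int(bits[i:i+4], 2), "x") for i in range(0, len(bits), 4)]

# group into triplets -> octal digits (pad on the left to a multiple of 3)
padded = bits.zfill((len(bits) + 2) // 3 * 3)
oct_digits = [format(int(padded[i:i+3], 2), "o") for i in range(0, len(padded), 3)]

print("".join(hex_digits))  # '6869'
print("".join(oct_digits))  # '064151'

Nothing is added to the bit string, yet the grouped code carries its meaning in larger, parallel chunks rather than in the arrangement of single elements, which is the point of the metaphor.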

At the level of natural language programming, a dualist philosopher might well point out that the material computer and the mental program partake of totally different but somehow interacting worlds. In fact, for several years, the hardware machinery could be patented, while copyrights protected the narrative-like high level programs.

With respect to brain processes and psychological processes, a fundamental identity is established by a Gabor-like elementary function (see Pribram 1991 for review). Dennis Gabor, a mathematician, developed his unit (a windowed Fourier transformation) to discern the maximum compressibility of a telephone message without losing intelligibility (Gabor 1946). Beyond this maximum an indeterminacy holds, that is, the meaning of the message becomes uncertain.
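
In one common notation (mine, not Gabor's original), the elementary function is a sinusoid windowed by a Gaussian envelope,

    g(t) \;=\; \exp\!\left(-\frac{(t - t_0)^2}{2\sigma^2}\right)\, \exp\!\left(2\pi i f_0 t\right),

centered at time t_0 and frequency f_0. Among all window shapes the Gaussian minimizes the joint spread \Delta t \, \Delta f, so each such unit marks the smallest patch of the time-frequency plane that a message can occupy.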

Gabor (1954) related his measure of uncertainty to Shannon’s measure of the reduction of uncertainty, the BIT (binary digit), the basic unit of current computer information processing. Gabor used the same mathematics (a Hilbert phase space) as did Dirac to describe the microstructure of matter, so Gabor named his unit a quantum of information.

During the 1970s and thereafter, experiments in many laboratories including mine showed that the Gabor function provides a good description of the architecture of activity in cortical dendritic fields in response to sensory stimulation (DeValois and DeValois 1988; Pribram 1991). Thus the same mathematical formulation describes an elementary psychological process, communication, and an elementary material process in the brain.

The Gabor quantum of information can therefore serve the same function for the wetware/minding relationship that the BIT serves for the hardware/software relation. English is not spoken by the computer, nor are there photographs in the computer when it processes the images taken by a digital camera. Likewise, there are no words or pictures in the brain, only circuits, chemical transactions and quantum-like holographic (holonomic) processes based on Gabor-like wavelets. To use another metaphor, the processing of an fMRI tomograph uses quantum holography. The pictures we see are reconstructions made possible by the process.

Not only radical materialists but also Identity theorists claim that neurological and psychological processes partake of sameness. I have stated something like this in the preceding paragraphs but limited the identity to the most elementary structural descriptions of brain and psychological processes. In higher, more complex organizations, meaning no longer resides in the arrangement of simple elements (the amount of information) but rather in the structure of redundancy. Complex organizations progressively depart from identity until they appear as duals far removed in structure and content from one another. Nonetheless at their roots, a structural identity such as the Gabor elementary function, a dendritic quantum of information, can be discerned.

This perspective leaves another issue unexamined: not only bit processing but natural language word processing by computer is also instantiated in processing hardware. Does this indicate an identity between natural language and the hardware? In the case of the computer there is a hard drive consisting of a CD-like disc that can be addressed by running the program. Storage is location addressable and can be increased by supplemental CDs. In image processing by tomography (such as fMRI) the medium permits a content addressable distributed process (quantum holography akin to Gabor’s windowed Fourier transform) to be instantiated. There must be some sort of ‘identity’ between the patterns inscribed in the grooves of the storage medium and the natural language, but the transduction into those grooves and back out must be implemented (literally) by input and output devices that include the kind of programming steps already noted. To use Wheeler’s famous phrase, going ‘from IT to BIT’ (but also from BIT to IT) is not just a simple matching procedure.

Memory storage occurs in the brain and partakes of both distributed processing in the fine fibered arbors (dendrites) and content addressable processing in the brain’s local circuitry. I have termed these aspects of memory storage and retrieval ‘deep’ and ‘surface’ structures (Pribram 1997; Pribram and Bradley 1998). As in the case of the hardware metaphors there are input and output steps in processing that suggest caution in assigning an identity to the relation between the experienced and the brain patterns.

I will take a case in point: cells in the temporal lobe of the brain of monkeys have been found to respond best (though not exclusively) to front views of hands. An identity (or materialist) theorist might be overjoyed with such a result. Of course, as has been pointed out, for a neuroscientist the question of ‘the how’ of hand recognition has simply been pushed back into a neuron from consideration of the whole person. I found such cells in monkeys who had been trained to choose a green stimulus in an earlier experiment. In these monkeys the best response was to green hands and green bars. In other laboratories Fourier decomposition of the stimulus showed that, in fact, the cells were responding to the Fourier descriptors of the stimulus (the tangent vectors to the outlines of the stimulus). To use a currently common expression: the point being, the brain process doesn’t look at all like what I am experiencing and, judging from other studies, probably not like what the monkey is experiencing.

I must belabor this issue because the eliminativist materialists might claim that all they are arguing for is that some form of explanation of experience in brain terms is called for and that when that explanation is complete we no longer need to use ‘Folk Psychology’ to describe our conscious experience. Next time you are in Times Square in New York, just note to your scientifically unwashed colleague: look at that beautiful Fourier-transformed tangent descriptor outlining the opponent process of the limited spectrum to which we can respond. He thinks you are balmy and says it’s just the illuminated picture of a pretty girl. And he’s conveyed a good deal more than you have.

A final caveat: I have distinguished information from matter (ex-formation). Dyed-in-the-wool (might I say woolly headed) materialists don’t buy into this distinction. For them, information is some sort of matter or its equivalent, energy (see Pribram 2004 on the spatialization of energy in the term E = mc²). Functionalists tend to ignore the problem of the instantiation of informational patterns, a problem brain scientists cannot ignore. Hopefully, I have suggested a direction that allows unpacking these issues.

The main unanswered question for identity theory has been: ‘How does the identity come about?’ One answer has been that brain processes and psychological processes are different aspects of some more basic process. Linguistic philosophers termed this difference a difference between brain talk and mind talk. The problem then arises as to what that untalked-about basic process is. (For an extensive and sophisticated philosophical discussion of the multiple aspects, though not the multiple instantiation, view that in many respects is close to mine, see Velmans 2000.) My answer (1986) has been that the untalked-about ontological basic process is identified as flux (describing measurements of energy and moment in terms of frequency). (Note that the mathematical use of frequency is neutral to time and space: in audition, frequency determines the pitch of a tone, time its duration.)

My additional claim is that by identifying flux as their ontological basis function, brain organization and psychological organization become more than merely multiple aspects, multiple perspectives on some unspecified underlying order. Transformations of flux into spacetime coordinates specify material and temporal locations of information and meaning. Thus, multiple aspects describe actual instantiations, embodiments, of an underlying order (Pribram 1971a; 1971b; 1997). This statement is a more precise rendering of what is intuitively called “information processing” by the brain (Velmans 2000; see also Gabora 2002 for an interesting and detailed perspective on how brain processes ‘amplify’ phenomenal information).

BRAIN AND EXPERIENCE

Co-ordinate Structures Relating Brain to Conscious Experience

Science is most often an analytical process. The structural perspective presented here is both analytic and synthetic. For practical purposes a considerable amount of analysis has to precede structural synthesis. Within an analytical perspective, there are two disconnects that have to be dealt with: one is between the organization of conscious experience and the organization of behavior. The second disconnect is between the organization of conscious experience and the organization of brain processes.

With respect to the disconnect between the organization of conscious experience and behavior, take the example of driving an automobile: avoiding other cars and stopping at red (but not other) traffic lights while engrossed in a conversation with a passenger.

With respect to the disconnect between the organization of conscious experience and the organization of brain processes, there are innumerable examples. For instance, in the brain processes that organize vision, certain stages operate in the spectral domain while our experience is in spacetime. This is no different from the operation of functional magnetic resonance imaging (fMRI), where the apparatus operates in the quantum holographic domain while the resulting pictures emerge in spacetime.

Another example is that my experience of self is unitary. However, under certain conditions two rather different selves can be discerned: an objective ‘me’ and a monitoring narrative ‘I’ (Pribram and Bradley 1998). When part of my body (or brain) is tampered with, for example my face after a dentist’s novocaine injection, or an injury to the right parietal cortex of the brain, an objective ‘me’ experiences a change in the contents, the objects, of the experience. My experience is about the imaged distortion of the (essentially unchanged) face and about the loss from my body image of an arm (which is perfectly intact and may, in fact, perform, that is, behave, normally).

But a rather different set of experiences constitute an ‘I,’ a narrative self composed of experienced episodes of events that are monitored by attention, intention, and thinking (see Pribram 1999, ‘Brain and the Composition of Conscious Experience’). Experienced episodes no longer become a part of the narrative ‘I’ after lesions of the frontolimbic formations of the brain.

As noted, the differences between experiencing the self as a ‘me’ and as an ‘I’ have been shown to be coordinate with, but not identical to, differences among the organization of brain processes. By coordinate I mean that there are transfer functions (codes, languages) that, at the boundaries between the experienced and the neural scales of organization, allow descriptions at one scale of organization to become related to the description of the other. The transfer functions that allow coordinate descriptions to be made between the organization of experiential, behavioral and brain processes have made up the chapters and lectures of my books and essays (for example Languages of the Brain 1971; Brain and Perception 1991; ‘Brain and the Composition of Conscious Experience’ 1999).

Taking conscious experience as the result of the complex of relations among brain systems, body systems, social systems and culture, the ‘cement’ that unites them is stored memory. Relevant brain processes operate by virtue of neural modifiability that results in the brain’s memory store. Physical, biological and social consequences of behavior are memorable, both in changing brain organization and in changing culture. The cultural consequences that have developed, such as new technologies and new linguistic usages, feed back on the brain to alter its memory store, and the consequent brain processes feed forward onto culture. Karl Popper incorporated this theme into ‘three interacting worlds’: culture, brain and mind (Popper and Eccles 1977; see also Pribram 1971, Chapters 2, 14, 15, and 20).

Lévi-Strauss pointed out that a reductionist approach, in which causal relations are sought, works for simple systems, but that for complex systems a structural analysis is needed. Consciousness is complex. The search for efficient causes is misplaced. Formal causes, embodied in the structure of the transactions and, on occasion, even final causes are more appropriate: the brain process intends the culture as in originally designing a piano (a cultural material memory) and in subsequently creating a piano sonata (a cultural mental memory). As I attend a happening and remember it, my brain’s memory systems are altered and can be altered thoughtfully in such a way that my intended future behavior will be affected propitiously as, for instance, in operant conditioning or more complex self-organizing transactions.

A Spatiotemporal Hold

Given this necessary dependence between mind and matter, and between matter and mind, how do the transactions between them actually take place? How are memories formed?

With regard to their neural locus, the basic transactions between matter and mind occur in the fine fibered branches of neurons (teledendrons and dendrites) and their connections (synapses and ephapses) in the brain (the evidence is reviewed in Pribram 1971 and 1991). However, such transactions need to be transmitted between sites by the larger nerve fibers (axons) for interactions among brain sites to occur. These interactions take time. The problem is that axons of various diameters and lengths must synchronize the transmission of the basic transactions. Rodolfo Llinas (2000; Pellionisz and Llinas 1979; 1985) has developed a tensor model that takes account of these differences in transmission. Llinas suggests that transmission time in the nervous system cannot be our experienced time, but rather something more like an Einsteinian, Minkowski spacetime.

Llinas’ model and mine both develop invariants by utilizing a convergence of sensory inputs and motor outputs to form entities, that is, objects. Mathematically this is described as linear covariation among sensory inputs (by both Llinas and Pribram) and nonlinear contravariation among motor outputs (by Llinas mathematically and by Pribram neurologically). Input and output converge on a ‘world point’ that provides for our perceptions of objects. Llinas’ theory and mine are complementary both with regard to their neuroanatomical referent and their mathematical description. I have elsewhere detailed this complementarity (Pribram 2004).

On the behavioral side, a stepwise process can be discerned to occur. Operant conditioning provides a useful example of this process. Fred Skinner (1989), a radical behaviorist, stated: ‘there are two unavoidable gaps in the behavioral account; one between a stimulating event and the organism’s response; the other between the consequences of an act and its effect on future behavior. Only brain science can fill these gaps. In doing so it completes the account; it does not give a different account of the same thing.’

In the 1960s and 1970s I proposed that a temporal hold characterizes the brain processes that fill these gaps. With regard to the first gap, that between a stimulating event and the organism’s response, the gap is negligible when the response is reflexive, habitual and/or automatic. However, when awareness is experienced the temporal hold becomes decisive. Sir Charles Sherrington phrased the issue as follows: ‘Between reflex [automatic] action and mind there seems to be actual opposition. Reflex action and mind seem almost mutually exclusive—the more reflex the reflex, the less does mind accompany it.’ I noted that habitual performance results from the activity of neural circuits in which the currency of transaction is the nerve impulse, as in Llinas’ tensor theory. Mind, awareness, on the other hand, demands a delay between processing patterns arriving at synapses and those departing from axon hillocks. The delay occurs in the fine fibered connectivity within circuits. (The evidence supporting these statements is reviewed in Pribram 1971, Chapter 6, pp. 104–106.)

Regarding the second gap, recently performed experiments have shown that during learning such a delay is imposed by the frontal cortex on the systems of the posterior convexity (reviewed in Pribram 1999). For example, a visual input activates the occipital cortex, then the frontal cortex, and then again the occipital cortex. As a consequence, in a learning task such as during operant conditioning, the frontal cortex has been shown to be critically involved. In these experiments an experimenter interposes a delay between stimulus presentation and the opportunity for response. Monkeys readily master such tasks. But when the frontal cortex is temporarily anesthetized learning fails to occur. This has been shown to be the result of a failure of a frontal ‘hold’ to occur, a hold that ordinarily activates cells in the more posterior parts of the brain during the delay part of the task.

In spacetime terms, the temporal hold is closely related to a spatial hold. Time is measured as the duration of a movement through space (like on an analogue clock face). The spatiotemporal hold can help explain Ben Libet’s findings (1966) which have puzzled neuroscientists for decades. Libet showed that selective electrical stimulation of the somato-sensory cortex is not ‘sensed’ for a quarter to a half second after the onset of the stimulation, whereas peripheral stimulation is sensed immediately. Libet showed this to be due to a backward projection in time to the occasion of the stimulation.

This sort of backward projection was shown by Bekesy (1967) to occur regularly in auditory perception. Bekesy used a device that cancelled the ordinary everyday spatial projection of auditory sensation away from the apparatus that does the sensing. Such projection is familiar to us in the case of stereophonic audio systems: sound appears to us to emanate from space away from the speakers and certainly away from the receiver that transduces the radio signals. Bekesy showed that the biological perceptual apparatus, be it auditory, visual or tactile, operates in the same fashion. Of interest with regard to Libet’s experiments is that when Bekesy cancelled out this spatial projection and tried to cross a busy street blindfolded, he could not anticipate the arrival of a car; the sound hit him so suddenly that he doubled over as if he had actually been struck by the car. Projection is an attribute of the spatiotemporal hold that is built into perception.

Another facet of the spatiotemporal hold that appears to be essential to conscious experience is that peripheral stimulations immediately engage a much larger cortical field than do Libet’s cortical stimulations. Electrical excitation of the sciatic nerve, for instance, evokes responses over the entire central part of the cerebral convexity (including the so-called motor and premotor cortex) even in anesthetized monkeys (Malis, Pribram and Kruger 1953). In addition, my colleagues and I identified a mediobasal (limbic) motor cortex that governs viscero-autonomic processing involved in conditioning, learning and remembering (Kaada, Pribram and Epstein 1949; Pribram, Reitz, McNeil and Spevak 1979; reviewed by Pribram 2003).

But these mediobasal and classical precentral ‘motor’ cortexes do more than control particular movements or viscero-autonomic effects per se. They are, in fact, sensory cortexes controlling action, that is, the projected achievement of a target (the evidence for this statement is reviewed in Pribram 1971 and 1991). As such they encode what Skinner called ‘behavior,’ the environmental consequences of an action, enlisting whatever movements are necessary to carry out the act. (Skinner stated that for him, the behavior of an organism is the paper record of accumulated responses that he took home to study.) When the precentral process contributes to awareness, if at all, it is of the errors in the environmental consequences of the behavior, not of the trajectory of the sensory receptors (the scans) or the movements by which to accomplish a percept or an action. The parallel in vision is that we do not sense the trajectory of saccades, only the visual image effected by them. The visual image and the environmental consequences of an action come to awareness some time later than the saccade and the movement itself.

Stimulation of the classical central (Rolandic) sensory and motor cortex should not be coordinate with awareness. If we were aware of our actions at the time they are occurring, we would mess them up (Miller, Galanter and Pribram 1960). Imagine being aware of your tongue and palate as you are giving a talk—in fact occasionally when your mouth becomes dry, you do become aware and just can’t go on. Or playing tennis or batting at baseball—the adage is ‘keep your eye on the ball.’ When taking notes during a lecture, conscious attention is on what the lecturer is saying not the writing of notes. Furthermore, the receptors and muscle contractions, the movements involved can vary according to whether one is watching a video, listening to the professor, using a writing pad, a laptop computer or standing at the blackboard. To repeat: the primary sensory and motor systems provide the encoded intended consequences of an action, not just the particular movements needed to carry them out. Thus, these systems need to function autonomously during the course of an action; only when, after a spatiotemporal hold, they act in concert with other brain systems do they participate in organizing any necessary change in future acts by way of conscious intervention.

There is another piece of evidence that supports the involvement of a spatiotemporal hold in achieving both conscious experience and learning. When we first began to study event related brain electrical potential changes (ERPs) we learned a great deal by using what is called an odd-ball technique. A particular stimulus is presented repeatedly and occasionally a different (but somewhat related) stimulus is randomly interposed in the series. The recorded ERPs are then averaged separately for the two types of stimulus presentation. The averaged records for the two types of stimuli are dissimilar, especially around 300 msec. after the presentation of the stimulus. We interpreted the change in the ERP for the odd-ball stimulus as indicating that an update in the perception of the stimulus sequence was occurring. But subsequent experimentation showed that another dissimilarity in ERPs could be observed at around 400 msec. and that updating did not occur (i.e. the dissimilarity at 300 msec. continued unchanged) unless the 400 msec. dissimilarity was present. In short, the dissimilarity at 300 msec. indicates that an update (often consciously experienced) is necessary and the one that occurs at 400 msec. heralds the actual updating (that is, learning is occurring). According to all this and much other evidence (reviewed by Pribram and McGuinness 1992), achieving conscious awareness involves specific brain systems and takes processing time.
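
To make the averaging procedure concrete, here is a minimal sketch (the data, array names and sampling rate are assumptions for illustration, not taken from the experiments described): single-trial epochs are averaged separately for the frequent and the odd-ball stimuli, and the two averages are compared in windows around 300 and 400 msec. after stimulus onset:

import numpy as np

fs = 1000                                   # assumed sampling rate in Hz
n_trials, n_samples = 200, 800              # 800 msec. epochs, one per trial
rng = np.random.default_rng(0)
epochs = rng.normal(size=(n_trials, n_samples))   # placeholder single-trial recordings
labels = rng.choice(["standard", "oddball"], size=n_trials, p=[0.85, 0.15])

# Average separately for the two types of stimulus presentation.
erp_standard = epochs[labels == "standard"].mean(axis=0)
erp_oddball = epochs[labels == "oddball"].mean(axis=0)

# Compare the averages in windows centered on ~300 and ~400 msec. post-stimulus.
def window_mean(erp, t_ms, half_width_ms=50):
    lo = int((t_ms - half_width_ms) * fs / 1000)
    hi = int((t_ms + half_width_ms) * fs / 1000)
    return erp[lo:hi].mean()

diff_300 = window_mean(erp_oddball, 300) - window_mean(erp_standard, 300)
diff_400 = window_mean(erp_oddball, 400) - window_mean(erp_standard, 400)
print(diff_300, diff_400)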

The Road to Supervenience

Ignoring the spatiotemporal hold has led some philosophers to opt for one of two very different accounts of the relationship between our conscious experience and brain processes. One such account states that, in fact, there is no relationship, that conscious experience is a useless epiphenomenon (reviewed but not adopted by Velmans 2000), which in the extreme would hold that consciousness was invented by God to torture us. Another attempted explanation, somewhat less extreme, is that the relationship simply exists by virtue of supervenience, that is, an immediate, unexplained, downward causation.

The problem with the epiphenomenon view is that there is much evidence against it: Often the pen is mightier than the sword. We imagine musical instruments and musical phrases and stir others when we implement them.

As for supervenience, the major problem is to account for ‘how’ it is accomplished (Velmans 2000). Just what might be the relation between ineffable conscious mind and palpable matter? According to the view proposed earlier in this essay, the answer lies in the complementary relationship between matter and mind as a two-way dependence of ex-formation (matter) on in-formation (communication) and of information on ex-formation—in less technical language: on information processing by virtue of brain processes. Pursuing this formulation goes a long way toward resolving the issue.

But another issue needs a different resolution. Much of what we consciously experience is indeed an emergent epiphenomenon (unless we can get to use it in a talk show). Non-conscious, automatic processing takes up the major portion of the brain’s metabolism. The question so put asks: What brain systems and processes are responsible for that aspect of conscious experience that supervenes on brain processes so as to modify them for future use and how does it do so?

My answer is that much of conscious experience is only initially epiphenomenal; and further, that supervenience occurs by virtue of the spatiotemporal hold. It is the hold that allows behavior, defined as the consequence of the organism’s action, to mediate the registration of an experience so that it becomes available to the brain’s future processing. Conscious experiences are initially emergent from brain processes produced by input generated in the brain by the physiological, chemical, physical and sociocultural environment. When changes occur in that environment, changes are produced in the brain processes. Only when these peripheral changes become implemented in the brain’s memory do the resultant experiences become accessible to further processing.

Implementation is stepwise: the environmental patterns that form conscious experience induce a neural pattern, that is, a temporary dominant focus, an attractor. The neural pattern develops over 300 msec. when novelty is encountered (novelty can be generated internally, as when there is a shift in perception of an ambiguous figure such as a Necker cube). Over another 100 msec. the attractor, the temporary dominant focus, gains extended control over brain processing, for instance through spectral phase locking between frontal and posterior cortices. (For an interesting, somewhat less brain-detailed view somewhat similar to mine, see Gabora 2002.)

Consciousness of an experience, when attained, thus can affect subsequent automatic brain/behavioral processes by virtue of gaining control over them, allowing changes to occur consonant with the experienced novel context. In a sense the experience itself is momentarily an epiphenomenon: though produced by inputs to the brain, these brain patterns and the conscious experiences are ordinarily fleeting and do not immediately become coordinate with any lasting brain patterns. The effect of the conscious experience has to become proactive.

Thus supervenience is not effected by some immediate conscious mental pattern being impressed on (or matched to) a pattern of brain processes. Supervenience depends on a spatiotemporal hold that makes possible several shifts in brain processing away from sensory and viscero-autonomic inputs, shifts in the location of temporary dominant foci (attractors). These shifts allow the brain patterns coordinate with the initial experience to co-opt other brain processes that ultimately control consequent behavior. Behavior, in turn, modifies viscero-autonomic and sensory processes that in their turn modify subsequent shifts in attractors until the novel experiences become implemented. More on this under ‘Attention.’

MODES OF CONSCIOUS EXPERIENCE 

Types of Brain Organizations

In earlier presentations (Pribram 1976; 1977) I identified three modes within which the attractors operating in the brain help organize our experience. These modes are states, contents and processes. Conscious states are organized primarily by neurochemical states. The wealth of psychopharmacological influences on moods such as depression and elation attests to this relationship. The biochemical and biophysical substrates of anesthesia, sleep and dreaming are being investigated at the synaptic, dendritic, membrane, channel and microtubular scale (see, for example, E. Roy John 2000; Hameroff 1987; Hameroff and Penrose 1995; Jibu, Pribram and Yasue 1996). This active field of investigation is handled in those publications.

The contents of consciousness, ordinarily spoken of as perception, are addressed by DeValois and DeValois (1988) and in my book ‘Brain and Perception’ (1991). These works review the wealth of evidence, obtained in my laboratory and those of others, on how different brain processes influence the organization of the contents of consciousness.

In a more recent paper entitled ‘Brain and the Organization of Conscious Experience’ (Pribram 1999) I addressed additional issues that, from the traditional philosophical perspective, are addressed under the concept ‘intentionality.’ Intentionality is, in these traditions, defined differently from intention in the sense of purpose (which will be discussed later in the current essay under the topic ‘Free Will’). Also, ‘intensionality’ (addressed under the heading ‘Attention’) differs from intentionality. Intensionality concerns the intensive aspects of experience that are contrasted to its extensional aspects.

To return to the concept ‘intentionality’: Franz Brentano (1929/1874) noted that just as all of our intentions need not be actualized, so also our perceptions are directed toward an object but the object need not be realized. Brentano spoke of ‘intentional inexistence’ using the parallel between unfulfilled intended acts and unfulfilled perceptions as in imagining a unicorn. Husserl simplified Brentano’s term to ‘intentionality’ again emphasizing that the process need not refer to an actual sensory input.

Human clinical evidence, and experimental evidence obtained with non-human primates, has shown that the systems centered on the posterior convexity of the brain are involved in the intentional aspect of conscious experience. Separate such brain systems can be distinguished: those controlling egocentric (body-centered), allocentric (beyond the body, outer-centered) and object-centered ‘spaces.’ By contrast, the basal frontolimbic forebrain is critically involved in the intensive aspects of experience in terms of experiencing episodes of novelty and of disruption of ongoing processes. These same parts of the brain control a readiness to bind these episodes together into a narrative (Pribram 1999; Pribram and Bradley 1998; Newberg and d’Aquili 2001; Koechlin, Ody and Kouneiher 2003).

However, these very same brain organizations are molded by biological and social factors that are, in turn, organized by the brain organizations. Human brains are critical to the invention of bicycles, the writing of novels and the construction of economic systems that, in turn, mold brain organization. The phenomenological approaches to conscious experience taken by Brentano, and his followers, Husserl (1931) and Heidegger (1966) acknowledge these interrelationships, but do not detail the necessary experimentally based data (especially what the brain and behavioral sciences currently have to offer) that pulls it all together.

Nor does their ‘Lebenswelt’ (Husserl) or ‘In-der-Welt-Sein’ (Heidegger) detail the structural precision of the processes that are involved in the reach from being to becoming in the material aspects of the world that is the essence of Ilya Prigogine’s contribution (e.g. Prigogine 1980; Prigogine and Stengers 1984). Prigogine provides such structure in descriptions of self-organizing systems forming far from equilibrium. Further details of such processes have been worked out (see, for instance, conference proceedings edited by Pribram 1994 and by Pribram and King 1996) in terms of phase spaces that contain attractors that ‘pull’ rather than causally ‘push’ organizational complexity. What needs to be done is to place these data-based advances in the theory of matter into the framework of phenomenological analysis in order to forge a comprehensive theoretical frame for a science of psychology (see Heelan 1963; Pribram 1981).

The processing mode of conscious experience binds state with perceived content and content with state. I am in a state of hunger and thirst and suddenly perceive hitherto ignored restaurant signs all over the place, even when they are written in the Russian alphabet (the Zeigarnik 1972 effect). I am on my way to work, urgently considering the day’s tasks, when I pass a doughnut shop. Perceiving the fresh baking odors stops me in my tracks; I perceive the store window with its display and I go in and buy a couple of doughnuts because now I am in a new state: I feel hungry.

Conscious processing can, in turn, be parsed into 1) attention as pre-processing sensory, kinesthetic and visceral inputs; 2) thinking as pre-processing remembering; and 3) intention as pre-processing motor output. These pre-processings form the grist for the mill of the remainder of this essay.

Attention

At one point William James exclaimed that he was tired of trying to understand consciousness and that we should stick to understanding attention. But, of course, he did no such thing. Nonetheless attention is a good starting place to examine the issue as to how in-formation supervenes on the ex-formation of the brain.

Skinner’s realization that the brain is critically involved in the operant conditioning process provides the key to explanation. Experience does not immediately supervene on neural processing during a perception or an action. Rather, at any moment, neural patterns are generated by a novel and unexpected sensory input or composed by an internal set of ongoing events. These neural patterns act as temporary dominant foci, as attractors. (A review of the experimental evidence that leads to the concept of a temporary focus is in Pribram 1971, Languages of the Brain, pp. 78-80.) The neural circuits involved operate efficiently to preprocess further sensory input or to preprocess an action. Ordinarily these preprocessings proceed with no time for ‘mind’ to accompany them, as Sir Charles Sherrington so eloquently put it. Once preprocessing is completed, control shifts automatically to other patterns in response to current demands—unless there is an intervention by some novel happening (see Miller, Galanter and Pribram 1960: Plans and the Structure of Behavior). Much has been made of an action-perception cycle. By contrast, what I am emphasizing here is an automatic-conscious processing cycle.
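The control structure described in Plans and the Structure of Behavior is the Test-Operate-Test-Exit (TOTE) unit. The sketch below is a minimal, hypothetical Python rendering of such a cycle, not a transcription of the 1960 formulation: automatic operation continues while a test reports incongruity, and control shifts either when the incongruity is resolved or when a novel happening interrupts.

```python
def tote(test, operate, novelty_check, max_cycles=100):
    """Minimal Test-Operate-Test-Exit loop (after Miller, Galanter & Pribram 1960).

    Runs 'operate' automatically while 'test' reports a mismatch, handing
    control back (exiting) either when the mismatch is resolved or when a
    novel happening interrupts the automatic run.
    """
    for _ in range(max_cycles):
        if novelty_check():          # a novel event interrupts automatic processing
            return "interrupted: control shifts to the novel happening"
        if not test():               # incongruity resolved
            return "exit: control passes to the next pattern"
        operate()                    # automatic preprocessing continues
    return "exit: cycle limit reached"

# Hypothetical usage: reduce a mismatch counter to zero with no novel event.
state = {"mismatch": 3}
print(tote(test=lambda: state["mismatch"] > 0,
           operate=lambda: state.update(mismatch=state["mismatch"] - 1),
           novelty_check=lambda: False))
```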

The automatic (unconscious) processes are, at any moment, more like feedforward programs than error-sensitive processes subject to correction by feedback. We are not aware of the process by which we prehend an object. As noted, this is a good thing—we’d only mess up. So, does that leave us with all conscious experience as an epiphenomenon? Not at all. After I reflexly remove my hand from a hot flame, I contemplate the happening. Our cat is an excellent example: he looks at his paw and licks it—then looks at the offending object and reaches out toward it, but this time does not touch it. He repeats this procedure several times over. If I may anthropomorphize, the cat’s conscious awareness of the incident, his ERP at 300 millisec. and the later change at 400 millisec., indicate how awareness of what has happened alters (preprocesses) future behavior. In the example given, the cat’s attentional preprocessing reinforced the change in subsequent behavior several times—in operant conditioning terms, the cat’s observational behavior was shaping the changes in his behavior. And the non-behavioristic claim is that the shaping can occur not only non-consciously, but also by way of conscious awareness of what is happening. Conscious attention shapes subsequent behavior.

Thinking

What about the patterns that characterize our thought processes? Do they supervene directly onto patterns of brain activity? Freud, as well as many others, defined thought as implicit action and based his talk therapy on that principle. According to the view that I have here assembled, implicit action remains implicit; that is, we remain unaware of the ongoing processing. Freud pointed out that these preprocesses constitute the person’s memory when looked at retrospectively and, at the same time, that person’s motivation when looked at prospectively (Pribram and Gill 1976). When we become aware of the results of this preprocessing, that is, when we consciously think about something, we actually do involve the body’s effectors, muscles and glands. Watson was not far off in his physiological behaviorism. Evidence continues to accumulate that very slight changes in muscle tone or in breathing or heart rate variability occur during thinking (see, for example, a review by McGuigan 1976). William James and, more recently, Antonio Damasio have called attention to the involvement of feelings as bodily responses to what happens and to how these feelings influence cognition and the making of choices (Damasio 1994). What is being made explicit is that these physiological body responses change brain preprocessing (re-membering) so that subsequent thinking becomes modified.

My claim therefore is akin to that made by William James, but adds whole-body attitudinal inputs and, additionally, the environmental consequences of behavior. For the preprocessing of memory that motivates a thought to become conscious, it must be ‘taken to heart’ and acted upon through viscero-autonomic, gestural or subvocal acting out. Action on the body and on the world must take place, albeit sometimes only subliminally and tentatively, to shape the memory-motive structures, the temporary dominant foci, the attractors characterizing the patterns of brain pre-processes in thinking.

Karl Popper is close to this formulation in his concept of the necessary interaction among ‘Three Worlds’ to achieve consciousness. Popper’s three worlds were brain, culture and mentation. Contemporary suggestions indicate that a ‘Fourth World,’ the body, must be added to these interactions. Thus and only thus can the pen be mightier than the sword.

Intention as Free Will

Taking the primacy of conscious experience as the starting point of inquiry resolves not only such issues as ‘downward’ causation, as handled in the previous sections, but also the issue of free will. A scientific reading of what constitutes freedom would state that although one’s actions are constrained in a variety of ways, the measure of the degrees of freedom that remain is experienced as free will. Voluntary, intended behavior rests on a parallel feed-forward pre-process in which a signal presets the execution of the process (Helmholtz 1909/1924; Sperry 1947; Teuber 1964). Helmholtz used as an example the saccadic movements of the eyes that place the retina where it needs to be to receive the ‘desired’ input, that is, the target of the intent.

Much has been made of the fact that brain processes can be recorded prior to the execution of a voluntary act. But, as noted, thank goodness my behavior is not burdened with continuous conscious experience appropriate to that behavior. Even my spontaneous lectures in a classroom run off at a rate that would be impeded by any awareness of how I am saying something. Awareness comes from watching the faces of the students—I must slow down, ask for questions, etc.

In more technical terms, contrary to Einstein’s view, God does play dice with the universe and with you and me. The six-sided die even has numbers on it—it is highly constrained, determined. But throw the dice (two of them) and there are a great many possibilities as to how they will land. The initial conditions are determined by the six-sided dice; the throw and its dynamics are constrained only by gravity and the gaming table and, for all practical purposes, remain remarkably undetermined. And conscious experience, because it comes late, allows humans to influence future contextual constraints (the gaming table) on the basis of their experience—else how would casinos stay in business?
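A short worked count (illustrative only) makes the arithmetic of the analogy explicit: each die is rigidly constrained to six faces, yet the throw of two dice spans 36 ordered outcomes and 11 possible sums of unequal likelihood.

```python
from itertools import product
from collections import Counter

faces = range(1, 7)                          # each die: a fully determined set of six faces
outcomes = list(product(faces, faces))       # the throw: 36 ordered possibilities
sums = Counter(a + b for a, b in outcomes)

print(len(outcomes))                         # 36
for total in sorted(sums):                   # 11 possible sums, from 2 to 12
    print(total, sums[total] / len(outcomes))
```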

In short, my claim is that freedom comes from action, from doing something with the constrained anatomy, the structure of the situation. As described by non-linear dynamics, the future depends on initial conditions and the constraints operating at any moment. These determine the degrees of freedom, the state space within which the trajectories of the process must operate. Equally important is the noise in the system, so that the action is not captured by the first attractor it encounters (the first well in the landscape—that is why the roulette wheel is kept actively rotating). In experiments my colleagues and I performed (Pribram, Reitz, McNeil and Spevack 1979), studying classical conditioning in amygdalectomized monkeys, the animals failed to become conditioned. The failure was shown to depend on a reduction (when compared with the behavior of normal control subjects) of variability (noise) in their initial responses to the unconditioned stimulus; the animals were therefore unable to bridge the time gap necessary for them to connect the conditional stimulus to the unconditioned stimulus. Simply put, they were more constrained, more compulsive than their controls.
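The role of noise can be illustrated with a minimal numerical sketch (hypothetical landscape and parameters; this is not a model of the 1979 experiment): on an asymmetric double-well landscape, a noiseless trajectory settles into the first well it meets, whereas added variability lets the trajectory escape that well and reach the deeper attractor.

```python
import random

def descend(x, noise, steps=20000, dt=0.01, seed=0):
    """Noisy gradient descent on the toy landscape V(x) = x**4 - 4*x**2 + x,
    which has a shallow well near x = 1.35 and a deeper well near x = -1.47."""
    rng = random.Random(seed)
    for _ in range(steps):
        grad = 4 * x**3 - 8 * x + 1                       # dV/dx
        x += -dt * grad + noise * rng.gauss(0, dt ** 0.5)  # drift plus noise
    return x

# Starting near the shallow well: without noise the state stays trapped there;
# with noise it can cross the barrier and typically ends near the deeper well.
print(round(descend(1.0, noise=0.0), 2))   # stays near the shallow well (~1.35)
print(round(descend(1.0, noise=1.5), 2))   # usually settles near the deep well
```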

CODA

Pervasive Consciousness

Defining an aspect of conscious experience in terms of narrative indicates that experience partakes of a larger consciousness, tunes into that more encompassing knowing together. Taking the stance that I have taken in this essay, there is only a step from the existential conscious experience of living in this physical, biological, social and cultural world to defining the cultural world as spiritual. By spiritual I mean that our conscious experience is attracted to patterns (informational structures?) beyond our immediate daily concerns. Such patterns may constitute quantum physics, organic chemistry, history, social interactions, economics or religious beliefs. These interests all comprise stories, and the same part of the brain that is involved in creating the narrative ‘I’ is involved in partaking in these other narrative constructions (see Pribram and Bradley 1998).

The search for understanding is indeed a spiritual quest whether esoteric, artistic or scientific. Understanding consciousness as developed in this essay ought to go a long way toward unifying these quests.

Summary

Taking conscious experience as primary, that from which all (including scientific) knowing is derived, resolves many of the issues now so fervently debated by both philosophers and scientists. First: the privacy of conscious experience is contested; in fact, it is conscious experience, as opposed to unconscious processing, that is the medium of communication.

Second: communication, minding, replaces ineffable mind in portraying the matter/mind relationship. Mapping this relationship by means of the Fourier transformation, matter can be seen as an ‘ex-formation’ constructed from a basic holographic-like flux from which minding is also constructed as ‘in-formation.’ Matter and minding are mutually dependent on one another: it takes mathematical minding to describe matter, and minding, communication, cannot occur without a material medium.
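The sense in which a Fourier-domain representation is ‘holographic-like’ can be shown with a small numerical sketch (illustrative only; the mapping of ‘in-formation’ and ‘ex-formation’ onto the two domains is a gloss on the text, not the author’s own formalism): an event localized in one domain is spread across every coefficient of its transform, and the inverse transform recovers the original.

```python
import numpy as np

# A sharply localized 'event' in the space-time domain.
signal = np.zeros(64)
signal[10] = 1.0

spectrum = np.fft.fft(signal)           # transform domain: the event is spread over all coefficients
print(np.count_nonzero(np.abs(spectrum) > 1e-12))   # 64: every coefficient carries information about it

recovered = np.fft.ifft(spectrum).real  # inverse transform reconstructs the localized event
print(np.allclose(recovered, signal))   # True
```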

Third: Given this portrayal of the matter/minding relationship, certain aspects of information processing by the brain can be tackled. To begin, the timing within circuits of the brain cannot lie within experienced time (duration) because the paths of conduction vary with regard both to length and to fiber size, which determine speed of conduction. If communication of a pattern and/or synchrony is to be achieved, a higher-order spacetime such as that developed in relativity theory must be operative.

This is one indication that the brain processes coordinate with conscious experience must be forged much as a musical instrument must be forged to provide a medium for the production of music. Forging takes place within spacetime and involves not only brain processes per se but inputs from and outputs to the body as well as the physical and sociocultural environment.

On the basis of identifying brain systems coordinate with consciousness, at least two modes of experience can be identified: a) an ‘objective me’ distinguishable from an objective other, and b) a ‘monitoring, narrative I’ constructed of episodes and events. Paradoxically, the same brain processes that are coordinate with the ‘monitoring, narrative I’ are also coordinate with experienced spirituality.

References

Barrett, T.W. (1993) Is Quantum Physics a Branch of Sampling Theory? In Delanoue, G. Lochak & Lochak (eds) Courants, Amers, Ecueils en Microphysique. Fondation Louis de Broglie, Paris.

Bekesey, G. (1967) Sensory Inhibition. Princeton University Press, Princeton NJ.

Bohm, D. (1973) Quantum Theory as an indication of a new order in physics. Part B. Implicate and Explicate Order in physical law. Foundations of Physics, 3,pp. 139-168.

Bohr, N. (1961) Atomic Physics and Human Knowledge. Science Editions, New York.

Brentano, F. 1929/1874 Sensory and Noetic Consciousness, ed. O.Kraus, trans. M. Schattie & L.L. McAlister. Humanities Press, New York.

Chalmers, D.J. (1996) The Conscious Mind: In Search of a Theory of Conscious Experience. Oxford Press, New York.

Chapline, G. (1999) Is theoretical physics the same thing as mathematics? Physical Reports 315, 95-105.

Churchland, P.M. (1995) The puzzle of conscious experience. Sci. Am. 273:80-86.

Churchland P.S. (1986) Neurophilosophy. MIT Press, Cambridge.

Crick, F. (1994) The Astonishing Hypothesis. Scribner, New York.

Damasio, A.R. (1994) Descartes’ Error: Emotion, Reason and the Human Brain. G.P. Putnam’s Sons, New York.

Descartes, R. (1972/1662) Treatise on man. T.S. Hall trans. Harvard University Press, Cambridge.

DeValois, R.L. and DeValois, K.K. (1988) Spatial Vision (Oxford Psychology Series # 14) Oxford University Press, New York.

DeValois, R.L. and DeValois. K.K. (1993) A multistage color model.

Gabor, D.(1946) Theory of Communication. Journal of the Institute of Electrical Engineers, 93,429-441.

Gabor, D. (1948) A new microscopic principle. Nature, 161, 777-778.

Gabora, L. (2002) Amplifying Phenomenal Information: Toward a Fundamental Theory of Consciousness. Journal of Consciousness Studies 9, no 8, pp. 3-29.

Hameroff, S. R. (1987) Ultimate Computing: Biomolecular Consciousness and Nanotechnology. Elsevier Science Publishers, B.V. Amsterdam.

Hameroff, S. and Penrose, R. (1995) Orchestrated reduction of quantum coherence in brain microtubules: a model for consciousness. In King, J.S. and Pribram, K.H. Is the brain too important to be left to specialists to study? The third Appalachian Conference on Behavioral Neurodynamics. Lawrence Erlbaum Associates, Mahwah NJ.

Heelan, P. A. (1963) Space-Perception and the Philosophy of Science. University of California Press Berkeley, Los Angeles, London.

Heidegger, M. (1966) Discourse on Thinking. Trans, from Gelassenheit by J Anderson and E.M. Freund. Harper & Row, New York.

Helmholtz, H. (1909/1924) Handbook of Physiological Optics (3rd Edition, J. P. C. Southall). Optical Society of America, Rochester NY.

Husserl, E. (1931) Ideas: A General Introduction to Pure Phenomenology. Trans. W.R. Boyce Gibson. Allen and Unwin. London.

Heisenberg, W. (1930) The Physical Principles of the Quantum theory, Dover Publications, London.

King, J. S. & Pribram, K.H. (Eds.) (1995) Scale in Conscious Experience: Is the Brain Too Important to be Left to Specialists to Study? The Third Appalachian Conference on Behavioral Neurodynamics. Lawrence Erlbaum Associates, Inc., New Jersey.

Jibu, M., Pribram, K.H., & Yasue, K. (1996) From conscious experience to memory storage and retrieval: The role of quantum brain dynamics and boson condensation of evanescent photons. International Journal of Modern Physics B, Vol. 10, Nos. 13 & 14, pp. 1735-1754.

John, E.R. (2001) A field theory of consciousness. Consciousness and Cognition 10 (2) 184-213.

Land, E.H. (1986) Recent advances in retinex theory. Vision Research 26: 7-22.

Libet, B. (1982) Brain stimulation in the study of neuronal function for conscious sensory experiences. Human Neurobiology 1, 235-42.

Llinas. R.R. (2001) I of the Vortex: From Neurons to Self. MIT Press, Cambridge.

Kaada, B. R., Pribram, K. H. & Epstein, J. A. (1949) Respiratory and vascular responses in monkeys from temporal pole, insula, orbital surface and cingulate gyrus. J. Neurophysiol., 12, pp. 347-356.

Koechlin, E., Ody, C. & Kouneiher, F. (2003) The Architecture of Cognitive Control in the Human Prefrontal Cortex. Science, Vol. 302, pp. 1181-1184.

Mach, E. The Analysis of the Sensations, The Monist, I, 48-68.

Malis, L.I., Pribram, K. H., & Kruger, L. (1953) Action potential in “motor” cortex evoked by peripheral nerve stimulation. J. Neurophysiol., 16, pp. 161-167.

Miller, G. A., Galanter, E. & Pribram, K. H. (1960) Plans and the Structure of Behavior. New York: Henry Holt. (Russian trans.; also in Japanese, German, Spanish, Italian.)

McGuigan, F.J. (1976) The Function of covert oral behavior in linguistic coding and Internal information processing. Psychology in Progress: An Interim Report. New York Academy of Sciences 270 57-89 NY.

Moghaddam, F.M. (2003) In Culture and Psychology Vol 9 (3): 221-232 SAGE publications London, Thousand Oaks, CA and New Delhi.

Pellionisz, A. and Llinas, R. (1979) Brain modeling by tensor network theory and computer simulation. The cerebellum: Distributed processor for predictive coordination. Neuroscience 4: 323-348.

Pellionisz, A and Llinas R. (1985), Tensor network theory of the metaorganization of functional geometries in the CNS. Neuroscience 16: 245-273.

Popper, K.R. and Eccles, J.C. (1977) The Self and Its Brain: An Argument for Interactionism.

Pribram, K.H. (1959) On the Neurology of Thinking. Behav. Sci., 4, pp. 265-287.

Pribram, K. H. (1971) The realization of mind. Synthese, 22, pp. 313-322.

Pribram, K. H. (1971) Languages of the Brain: Experimental Paradoxes and Principles in Neuropsychology. Englewood Cliffs, NJ: Prentice-Hall; Monterey, CA: Brooks/Cole, 1977; New York: Brandon House, 1982. (Translations in Russian, Japanese, Italian, Spanish).

Pribram, K. H. (1976) Problems concerning the structure of consciousness. In G. Globus, G. Maxwell, & I. Savodnik (Eds.), Consciousness and brain: A scientific and philosophical inquiry. New York: Plenum, pp. 297-313.

Pribram, K. H. (1977) Alternate states of consciousness: Some observations on the organization of studies of mind, brain and behavior. In N. E. Zinberg (Ed.), Alternate states of consciousness. New York: The Free Press, pp. 220-229.

Pribram, K. H. (1981) Behaviorism, phenomenology and holism in psychology: A scientific analysis. In R. Valle & R. von Eckartsberg (Eds.), The Metaphors of Consciousness. New York: Plenum, pp. 141-151.

Pribram, K. H. (1986) The cognitive revolution and mind/brain issues. American Psychologist, Vol. 41, No. 5 , pp. 507-520.

Pribram, K. H. (1991) Brain and Perception: Holonomy and Structure in Figural Processing. New Jersey: Lawrence Erlbaum Associates, Inc.

Pribram, K.H. (ed.) (1994) Origins: Brain and Self-Organization. The Second Appalachian Conference on Behavioral Neurodynamics. Lawrence Erlbaum Associates, Publishers, Hillsdale NJ.

Pribram, K.H. (1997) What is Mind that the Brain May Order It? In V. Mandrekar & P.R. Masani (Eds.) Proceedings of Symposia in Applied Mathematics, Vol. 2: Proceedings of the Norbert Wiener Centenary Congress, 1994. Providence, RI: American Mathematical Society, pp. 301-329. Reprinted: The Noetic Journal, Vol. 1, June 1997, pp. 2-5.

Pribram, K.H. (1997) The Deep and Surface Structure of Memory and Conscious Learning: Toward a 21st Century Model. In Robert L. Solso (ed.) Mind and Brain Sciences in the 21st Century. MIT Press, Cambridge.

Pribram, K.H. (1999) Brain and the Composition of Conscious Experience. Journal of Consciousness Studies, 6, No. 5, pp. 19-42.

Pribram, K.H. (2003) Forebrain psychophysiology of feelings: interest and involvement. International Journal of Psychophysiology 48 115-131 Elsevier Science B.V.

Pribram, K.H. (2004) Brain and Mathematics. In Brain and Being: The boundary between Brain, Physics, Language and Culture. Eds. Globus, Pribram and Vitiello. Accepted for publication.

Pribram, K.H. (2004) Freud’s Project in the 21st Century. Accepted for publication.

Pribram, K.H. (2004) Freud’s Neurophysiology. Accepted for publication.

Pribram, K.H. & Bradley, R. (1998) The Brain, the Me, and the I. In M. Ferrari and R.J. Sternberg (Eds.) Self-Awareness: Its Nature and Development . New York: The Guilford Press, pp. 273-307.

Pribram, K. H. & Gill, M. M. (1976) Freud’s `Project’ Re-Assessed: Preface to Contemporary Cognitive Theory and Neuropsychology. New York: Basic Books.

Pribram, K.H. and King, J.S. (eds.) (1996) Learning as Self-Organization. The Fourth Appalachian Conference on Behavioral Neurodynamics. Lawrence Erlbaum Associates, Publishers, Mahwah, NJ.

Pribram, K. H. & McGuinness, D. (1992) Attention and para-attentional processing: Event-related brain potentials as tests of a model. In: D. Friedman & G. Bruder (Eds.), Annals of the New York Academy of Sciences, 658 (pp. 65-92). New York: New York Academy of Sciences.

Pribram, K. H., Reitz, S., McNeil, M. & Spevack, A. A. (1979) The effect of amygdalectomy on orienting and classical conditioning. Pavlovian J. Biol. Sci., 14, pp. 203-217.

Pribram, K.H., Xie, Zheng, Santa Maria, Hovis, Shan and King (2004) Forma. Accepted for publication.

Prigogine, I. (1980) From Being to Becoming: Time and Complexity in the Physical Sciences W.H. Freeman and Co. San Francisco.

Prigogine, I. and Stengers, I. (1984) Order out of Chaos. Heinemann, London.

Ratliff, F. (1965) Mach Bands: Quantitative studies on neural networks in the retina. Holden-Day Inc. San Francisco, London, Amsterdam.

Russell, B. (1948) Human Knowledge, Its Scope and Limits, Simon and Schuster, New York.

Skinner, B.F. (1989) The origins of cognitive thought. American Psychologist, 44 (1), 13-18.

Sperry, R.W. (1980) Mind/Brain interaction – Mentalism, yes – Dualism, no Neuroscience, 2, 195-206.

Stapp, H.P. (1997/1972) The Copenhagen Interpretation. American Journal of Physics 40 (8), 1098-1116.

Stapp, H.P. (1997) The Journal of Mind and Behavior, Vol. 18, Nos. 2 and 3, pp. 171-194.

Teuber, H.L. (1964) The riddle of frontal lobe function in man. In J.M. Warren and K. Akert (Eds.) The Frontal Granular Cortex and Behavior. McGraw-Hill pp. 410-444 New York.

Velmans,M. (2000) Understanding Consciousness. Routledge, London and Philadelphia.

Wigner, E. (1967) Symmetries and Reflections. Indiana University Press, Bloomington, IN.

Wilmer, E.N. (1946) Retinal Structure and Color Vision: A Statement and an Hypothesis. Cambridge University Press.

Zeigarnik, B.V. (1972) Experimental Abnormal Psychology Plenum New York.

Author bio

Karl H. Pribram, Distinguished Research Professor, Cognitive Neuroscience, Department of Psychology, Georgetown University, and School of Computational Sciences, George Mason University; Professor Emeritus, Stanford and Radford Universities.