Regenerative Semantics & Generative Grammar in the Proto-Linguistic Organ

Figure 1 (legend):

1. Gyrus cinguli
2. Corpus callosum
3. Capsula interna
4. Putamen
5. Nucleus anterior thalami
6. Amygdala
7. Insula
8. Lobus temporalis
9. Nucleus caudatus
10. Capsula externa
11. Claustrum
12. Commissura anterior
13. Pars medialis globi pallidi
14. Infundibulum
15. Columna fornicis
16. Tractus opticus
17. Cornu inferius ventriculi lateralis
18. Ventriculus lateralis
19. Fornix
20. Pars lateralis globi pallidi

Introduction.

          Any comprehensive philosophy of natural languages, or meta-linguistics, must include considerations of syntax, semantics, reference, phonology, truth values and pragmatics. Of these the most important and puzzling component remains semantics, a theory of meaning. We asked in a previous publication: what is it about certain marks, figures or noises that endows them with such distinctive meanings? In our opinion, as we expand on below, the most successful answers have identified in all of them 'propositional attitude' atomic particles which are neither language-neutral nor analytically divorced from the reality they struggle to represent (see Quine's famous book "Word and Object"). Having considered the historical arguments on 'meaning', whether 'referentialism' (words refer to things and their relationships in an actual or possible world) or Wittgenstein's 'use' (a conventionally assigned value within an existing social practice) and variations thereof, it has become clear that whatever theory of meaning we may want to elaborate, it cannot be isolated from an obligatory co-evaluation of both 'syntax' (the grammatical ordering of word relations to achieve maximum consistency for a given language) and 'pragmatics' (rules to achieve the maximum meaning content for a particular speech context). A truth-conditional theory of meaning is the unarticulated common denominator of all these approaches, usually expressed as the 'coherence value', or the way a 'true sentence' relates to others in a cognitive context.

          In the 1950s theoretical linguistics, under the leadership of Noam Chomsky, incorporated scientific methodology into the philosophy of language and effectively did away with the Skinnerian behaviorist skepticism about the worth of the neurobiological approach of studying the brain directly. In so doing, Chomsky opened a new search path toward discovering the meaning of mind and self-consciousness. However, Chomsky concentrated on the syntactic aspect of language structure, describing how natural-language practitioners are able to generate and select only well-formed strings of sentences, in a recursive manner appropriate for the given language, without much effort on the part of the subject. We regard the grammar selection generated during syntactic parsing by the 'deep structures' of the brain as a possible search for the best-fit semantic content of the generated sentences. One cannot be separated from the other: is the meaning predicated on the syntactic structure, or vice versa?
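          To make the notion of recursive generation concrete, here is a minimal, purely illustrative sketch of a generative grammar treated as a set of rewrite rules expanded recursively until only words remain. The grammar, lexicon and function names are our own inventions for this example and make no claim about the actual rules of any natural language or about Chomsky's specific formalism.

```python
import random

# A toy context-free grammar: each non-terminal expands into one of several
# right-hand sides; terminals are plain words. Grammar and lexicon are
# invented purely for illustration.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"], ["Det", "Adj", "N"]],
    "VP":  [["V", "NP"], ["V"]],
    "Det": [["the"], ["a"]],
    "Adj": [["small"], ["hungry"]],
    "N":   [["baby"], ["mother"], ["sound"]],
    "V":   [["hears"], ["imitates"]],
}

def generate(symbol="S"):
    """Recursively expand a symbol until only terminal words remain."""
    if symbol not in GRAMMAR:          # terminal: return the word itself
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    words = []
    for part in expansion:
        words.extend(generate(part))
    return words

if __name__ == "__main__":
    for _ in range(3):
        print(" ".join(generate()))    # e.g. "the hungry baby hears a sound"
```

Every string produced this way is well formed by construction, which is the point of the sketch: the recursion selects only grammatical sentences, leaving open our question of where their semantic content comes from.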

          During our continuing search for a definition of mind and consciousness we have stumbled upon many conundrums, like the definition of the 'living' and the role language plays in the experience of consciousness. We found the concept of a generative grammar very attractive, but fuzzy and incomplete. We have tried to supplement and complete that conceptual model by incorporating the amygdaloidal complex as the 'regenerative semantic organ' (rso), a component of a more complex 'proto-linguistic organ' (plo) that also houses the 'generative grammar organ' (ggo). Chomsky never assigned an anatomical locus to the ggo; we believe, and argue here, that it is located in the hippocampal formation. To do this we have pieced together our own observations with all the relevant laboratory and metaphysical research data available. The bigger question of how all this relates to thought generation and consciousness is our long-term project, of which this thesis is a fundamental part.

Neurobiological Foundations.

          The 'amygdaloidal complex' and its extensions comprise about half a dozen nuclei and their subdivisions scattered across cortical and subcortical areas (see #6 in Fig. 1 above). Peroxidase staining and other tracing studies since the 1980s have shown their unique connectivity with other brain areas, especially their connections to thalamic sensory-input integration areas, reticular activation areas, and hypothalamic / limbic cortex loci that coordinate and integrate affective feelings and the associated autonomic and endocrine responses linked to emotional behavior. The proto-linguistic organ (plo) is strategically located at the hub of practically every neural activity that is not a motor function. The amygdaloidal component of the plo is particularly noteworthy for the very fast role it plays in the neuro-endocrine modulation of fear, implicit memory and attention when life-threatening environmental (or body-proper) stimuli are presented to a subject, as we have discussed in other publications (see also LeDoux, 'The Emotional Brain'). The role it plays in the sexual behavior of rats points to another life-preserving aspect of the amygdala: the reproductive perpetuation of the species. Studies performed with fMRI in humans have established the amygdala's preference for responding to emotionally charged stimulation. The literature's well-established observations on Klüver-Bucy patients with bilateral amygdaloidal damage, whose ability to judge the presence of fear from facial expressions is impaired, confirm this result. The bearing this may have on the newborn's ability to read the mother's facial expressions during language development and future social adaptive behavior is worth considering. The details of the neuronal network connections integrating the amygdala with various neuro-endocrine sites and the executive cortex can be found in the figure above and at:

The integration of stress by the hypothalamus, amygdala and prefrontal cortex: balance between the autonomic nervous system and the neuroendocrine system.
Prog Brain Res (Netherlands), 2000, 126 p117-32

Neurochemical Foundations.

          Before discussing some relevant experiments we would like to briefly familiarize the reader with some 'archilayer-derived' cell groups with ascending and descending connections of a widespread and diffuse nature, collectively called the brain stem reticular activating system (RAS), a most important component of our Module 1. This monoaminergic system includes serotonergic fibers arising from several raphé (midline) nuclei in the midbrain, pons and medulla; noradrenergic and adrenergic fibers from the locus coeruleus and from cells in the lateral tegmental regions; and dopaminergic axons from cell groups in the ventral tegmentum and from the substantia nigra. The axons of the latter project to the striatum and those of the former to the cortex.

          To complete the picture we now add the cholinergic cells in the brain stem with ascending axons, and the histaminergic cells in the hypothalamus. Their importance lies in their known capacity to act on the thalamus and cortex, playing a role in controlling levels of consciousness by modulating transmission in the fibers of ascending and descending pathways of the brain. They are also known to influence the transmission of pain sensations and other nociceptive stimulation through descending axons that modulate reflexes at spinal levels.

          The well-documented participation of the amygdaloidal complex in mediating a fear response when a subject faces a new, never-before-experienced situation argues in favor of an inherited, stereotyped reflex adaptive response when the subject is presented with a stimulus representing a potential threat to the survival of the species or its reproductive functions; a genetic (intrinsic) memory linking the species' phylogenetic past with the present physical environment. This response has been shown in newborn chickens never before exposed to light: when they were flashed the animated image of a long-necked, short-tailed duck, a familiar sight for the species, there was no recorded behavioral reaction. When the same image was rotated 180 degrees (short neck, long tail), now resembling a known predator of the species, and flashed, the animals reacted violently, trying to escape from the 'predator'. The speed of the responses and their specificity in the modulation of neuro-humoral and behavioral integration is consistent with the view that some aspects of the environmental audio-visual field representing the threatening stimulus get encoded and compared with a genetically coded audio-visual gallery of 'icons' (amygdaloid implicit or genetic memory) associated with pain/pleasure activation responses, as detailed elsewhere.
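          To fix ideas, the comparison with an inherited 'gallery of icons' can be caricatured as template matching: a stimulus is encoded as a feature vector, compared against stored templates, and the best match, if close enough, releases the associated reflex. The feature encoding, the two templates and the threshold below are invented for illustration and carry no anatomical weight.

```python
import numpy as np

# Hypothetical "icon gallery": each inherited template is a small feature
# vector (e.g. neck length, tail length, motion speed) tagged with the
# reflex it should trigger. Values are invented for illustration.
ICON_GALLERY = {
    "parent_silhouette":   (np.array([0.9, 0.2, 0.3]), "approach"),
    "predator_silhouette": (np.array([0.2, 0.9, 0.8]), "escape"),
}

def reflex_response(stimulus, threshold=0.9):
    """Compare a stimulus vector with every stored icon (cosine similarity)
    and return the reflex of the best match, if it is close enough."""
    best_reflex, best_score = None, -1.0
    for name, (template, reflex) in ICON_GALLERY.items():
        score = stimulus @ template / (np.linalg.norm(stimulus) * np.linalg.norm(template))
        if score > best_score:
            best_reflex, best_score = reflex, score
    return best_reflex if best_score >= threshold else "no_reaction"

# Long-necked, short-tailed shape: matches the parent icon -> no alarm.
print(reflex_response(np.array([0.85, 0.25, 0.3])))   # approach
# Rotated 180 degrees (short neck, long tail): matches the predator icon.
print(reflex_response(np.array([0.25, 0.85, 0.75])))  # escape
```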

          It is not far-fetched to assume that the same neuro-humoral mediation mechanisms are also responsible for generating adaptive reflex responses against new, life-threatening environmental stimuli not experienced by the species before, which, if reinforced by successive exposures, will become incorporated into the explicit (social) memory repertoire of the species. We do not know how this survival information may get incorporated into the human zygote and translated from 'redundant' DNA into relevant neurochemicals to be stored as implicit, heritable information for the subject, but we suspect where. In fact, we know that the delayed component of the fear response (the environmental context of the new stimulus) in fear conditioning is controlled by the hippocampus, which we know participates in explicit memory storage. In what follows we select some experimental results on memory consolidation (explicit memory only) to document our thesis.

          Contrary to our expectations, there is now some evidence that the basolateral nucleus of the amygdala is also involved in memory consolidation into what we may describe as 'explicit' memory. This is mediated by noradrenergic neurotransmitters, as is evident from the enhancement of avoidance responses when norepinephrine or beta-adrenoceptor agonists are infused into the basolateral nucleus. Alpha-1 adrenoceptor activation has similar effects; both seem to enhance memory during emotion-laden avoidance training in animals. The ACTH produced during the training session activates noradrenergic mechanisms and enhances memory retention. The noradrenergic systems are known to facilitate the release of opioid peptides and GABAergic neurotransmitters. It has also been established that the neuronal plasticity required for successful emotional memory and learning is present in the amygdala. The induction of this neuronal plasticity has been shown to be inhibited by antagonists of the NMDA receptor site; the latter is actively involved in the fear conditioning response. Genetic modifications affecting the machinery of synaptic plasticity also affect the amygdala's adaptive responses to threatening situations.

For more detailed information see: Biol Psychiatry 1999 Nov 1;46(9):1140-52 and Trends Neurosci 1999 Dec;22(12):561-7.

           The neurochemical features of the amygdala are less than clearly specified because of the lack of consensus on the amygdala's delimitation as a structural and functional unit. More often than not there is an overlap with adjacent basal forebrain or temporal lobe areas, precisely the areas most associated with cognitive-emotive behavioral processes such as those involved in sensory coding, arousal-attentive activity, memory and learning. Sometimes these are referred to as amygdaloidal extensions; a case in point is the posterior limb of the anterior commissure (see Ann. NY Acad. Sci. (USA), Jun 29, 1999, 877, p. 645). Along similar lines, investigators from the University of Cambridge, UK (p. 412 of the same journal) have clarified that many of the results attributed to the amygdala, such as activation and reward, may be shared by the ventral striatum and limbic cortex through their shared dopamine innervation from the nucleus accumbens. Aversively motivated behavior may be similarly analyzed. From Johns Hopkins University, Gallagher et al. argue that the representational function attributed to the amygdaloidal basolateral complex depends on the latter's connections with the prefrontal cortex. Similarly, they found that the central amygdaloidal nucleus provides an important input to the magnocellular neurons of the basal forebrain. Both of these activities strongly extend the role of the amygdala to include attention and cognition.

Neuro-developmental Foundations.

            According to recent experiments, newborn infants are more likely to respond to sound combinations (words) characteristic of the mother's language than to those of a foreign tongue. If we consider the state of sensory development in the unborn fetus (see below), especially in the third trimester of gestation, we can conclude that the fetus may already be sensitive to stimuli in the maternal external environment, including linguistic phonemes. The combined effects of genetic and epigenetic factors are thus inextricably mingled from the very earliest stages of embryonic development. The unique combination of human gene-controlled factors, some of them carried for several million years, interacting with the enormous range of environmental idiosyncrasies, both internal (body-proper) and external, helps account for the uniqueness of each individual.

          As expected, metabolic activity in the newborn, as detected by functional imaging technology (PET, fMRI), is most marked in the sensory-motor cortex and brain stem, areas required for reflex functions. This pattern of metabolic activity varies across regions of the brain at different ages. To our surprise, it was found that at two to three months there is substantial metabolic activity in the visual and contiguous parietal cortex, which would correspond to the ongoing development of visual-spatial integrative function at such an early phase of development, something we had reported as unlikely in a previous publication (Visceral Brain, Noesis 2001). A few months later, prefrontal cortex activity is observed, arguably corresponding to the development in the child of higher cortical functions such as interactions with the surroundings, stranger anxiety, etc. We have argued, based on histochemical and psychological data, that audio-visual processing was mostly at subcortical levels, in the thalamic geniculate bodies and midbrain colliculi, areas endowed with adequate motor-reflex connections and tonotopic / visual orientation discrimination capabilities; we inferred that the baby was essentially cortically blind. We may have to reconsider some of our previous views, but we may also note that a satellite view of bridge-building activity on earth is no guarantee that the bridge is already being used for transportation; it is merely being prepared for that future, anticipated activity.

          Other data in support of our model come from various studies on the microstructure of cognition and distributed representations as they relate to emotion and facial expression patterns. Long before Poggio modeled his 3-layered, feed-forward network for object recognition, the undersigned had conducted facial electromyographic recordings on medical students under different emotional conditions evoked by the presentation of allusive pictures during the session. We were trying to design a method of detecting lies during cross-examination in criminal cases, something more reliable than the polygraph test. We developed consistent 'signature patterns' for the different emotions but had to use a very uncomfortable facial mask dotted with even more uncomfortable pin-prick EMG electrodes to avoid interphase potentials, etc. We abandoned the project without publishing the results, but reliable predictor data collected from the facial distribution of cranial nerves V and VII were obtained, even in the absence of detectable facial expressions. Much later, in 1992, Etcoff & Magee published "Categorical perception of facial expressions", Cognition 44:227-240, finding that the perception of facial expressions is tuned to discrete emotion categories. That was followed in 1995 by Bishop's "Neural Networks for Pattern Recognition" (Oxford University Press). There is also a very active group at UC San Diego under Dr. G. Cottrell, busy with emotion pattern recognition using 'Holons', a neural network program to categorize facial expressions; their results are to be published in a 2002 issue of the Journal of Cognitive Neuroscience.
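          For readers unfamiliar with the kind of model referred to above, here is a minimal, untrained sketch of a 3-layered feed-forward classifier of the general sort Poggio and Cottrell describe: a feature vector (for instance, readings from our EMG channels) is pushed through one hidden layer and mapped to a probability over emotion categories. The layer sizes, random weights and feature encoding are invented for illustration and do not reproduce any published architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Input -> hidden -> category scores. Sizes and weights are illustrative only.
n_features, n_hidden, n_categories = 16, 8, 4   # e.g. 16 EMG channels -> 4 emotions

W1 = rng.normal(scale=0.5, size=(n_hidden, n_features))
W2 = rng.normal(scale=0.5, size=(n_categories, n_hidden))

def categorize(face_features):
    """Forward pass: tanh hidden layer, softmax over emotion categories."""
    hidden = np.tanh(W1 @ face_features)
    scores = W2 @ hidden
    probs = np.exp(scores - scores.max())
    return probs / probs.sum()

sample = rng.normal(size=n_features)   # stand-in for one EMG recording
print(categorize(sample))              # probabilities over the 4 categories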

          One most important piece of information coming from this research, quite relevant to the worth of our 'bps' model of consciousness, is the fact that parallel processing of sensory data somehow imposes discontinuous category boundaries even when the stimulation presented is continuous. The continuity of the empirical object of our perception has been processed and stored as discontinuous category fragments! This comes as no surprise to us, who had always analogized the abstract structure of Kant's 'categorical imperatives' (see Insensible life…, Chapter 1 above & Telicom 1999) as a suitable model to guide and code for perceptual intuitions in the brain, something now called 'categorical perception'. To illustrate: the wavelength transitions in a rainbow form a smooth gradient whose steps lie beyond the resolution of our retina, yet what we 'see' is a banded, discontinuous rainbow, a discreteness not warranted by the physical reality out there!

          Another most relevant finding was that sound utterances varying continuously from a 'ba' to a 'pa' sound are not perceived as a continuous mixture of the two but as two sharply categorized, discontinuous events. Is this the way the baby decodes the mother's baby talk, priming the generative grammar organ (ggo) in the hippocampus into its syntactic ordering and coding activity? Before this can happen, whether during the intra-uterine or the early post-natal stage, the uttered maternal phonemes have to be processed in the amygdala for their semantic meaning as it relates to biological life preservation.
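          Categorical perception of this kind is easy to caricature: a single acoustic parameter such as voice onset time varies smoothly, but a steep decision function turns it into an essentially all-or-none 'ba' versus 'pa' percept. The boundary value, steepness and numbers below are illustrative assumptions, not measured psychophysical constants.

```python
import numpy as np

def perceive(voice_onset_time_ms, boundary=25.0, steepness=1.5):
    """Map a continuous voice-onset-time value to the probability of hearing
    'pa' rather than 'ba'; a steep logistic makes the percept nearly
    all-or-none near the boundary. Numbers are illustrative only."""
    p_pa = 1.0 / (1.0 + np.exp(-steepness * (voice_onset_time_ms - boundary)))
    return ("pa" if p_pa > 0.5 else "ba"), p_pa

for vot in (5, 15, 20, 24, 26, 30, 40, 55):   # a smooth physical continuum...
    label, p = perceive(vot)
    print(f"VOT {vot:2d} ms -> {label} (p_pa = {p:.2f})")
# ...is heard as two discrete categories with an abrupt boundary near 25 ms.
```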

          When confronted with a new, unfamiliar situation, a best-fit match of the unfamiliar sound with the inherited (implicit-memory) audio-visual amygdaloidal gallery is enough to trigger the corresponding humoral, affective disposition, laying the groundwork for a second, delayed, recursive examination of the physical context surrounding the stimulus; this requires a comparison with elements previously coded and stored in the hippocampal explicit-memory 'gallery'. Since every good theory should suggest an experiment to validate its premises and anticipate the results, we have been concurrently trying to develop a testable model. This has turned out to be a most difficult enterprise; we have asked the mathematical geniuses of France's Pi Society and the US Mega, ISPE, Prometheus and Mensa societies, but the difficult problem remains unsolved. What follows is our tentative approach to the problem; incidentally, after struggling blindfolded for a long time, we found that MIT's Fodor had already developed a somewhat similar formal model which we may now be able to adopt with little substantial pruning.

Mathematical Logic Foundations. 

      Our task here is to provide guidelines as to how best to test the premises upon which the thesis for the existence of a 'proto-linguistic organ' (anatomically located in the amygdaloidal complex and hippocampal formation) rests. The guidelines are themselves premised on our view of the inseparable link between the semantic, biological life-preserving content and the syntactic capture and articulation of that meaning into communicable, social life-preserving language particles or symbols able to accurately represent empirical objects / events in nature. We propose that the 'propositional attitude' concept (Fodor et al.) captures both the semantic and syntactic elements in logical representations, such that operations on them are causally sensitive to the syntactic components, the only ones operationally amenable to logical analysis; however, we will challenge some crucial aspects of this model in another chapter ahead. Another important premise, at this stage based on somewhat self-referential experience, is the observation that the 'thought process' may be defined as the emergent phenomenon prescribed by the syntactic operations defined over such representations, i.e., thought is a recursive concatenation of relevant 'propositional attitudes'. Whether this commits us to a concept of thoughts as physical realizations in the subject's brain, quaere! As we have argued elsewhere, thoughts require the capacity for self-consciousness, and neither has been demonstrated to be logically supervenient on the physical brain. But this exception should not deter us from continuing our analysis of language as a system of symbolic representations containing a combinatorial syntax, layered during late intra-uterine and post-natal life over an inherited combinatorial semantics. The latter represents the genetic memory of the species (implicit amygdaloidal memory); the former represents the social memory (explicit hippocampal memory).

          The equation reads: <Subject's (S) attitude (a) is that predicate (P) happens>, e.g., Subject (Angell) hopes (his attitude) 'that his theory is correct' (predicate sentence P), or <S a that P>, where S is the subject, (a) is his affective, mental disposition, and 'that P' is the proposition or action verb. It is most important to notice how the affective mental state is the relation (R) that joins subject and predicate, <S R P>.
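          As a purely notational aid, the <S a that P> schema can be rendered as a small data structure; the field names, the example, and the idea of a 'thought' as an ordered sequence of such triples are our own illustrative assumptions, not Fodor's formalism.

```python
from dataclasses import dataclass

# A minimal, hypothetical rendering of the <S a that P> schema above:
# a propositional attitude is a subject standing in an affective relation
# to a proposition.
@dataclass(frozen=True)
class PropositionalAttitude:
    subject: str      # S
    attitude: str     # a / R, the affective relation (hopes, fears, ...)
    proposition: str  # "that P"

    def __str__(self):
        return f"{self.subject} {self.attitude} that {self.proposition}"

example = PropositionalAttitude("Angell", "hopes", "his theory is correct")
print(example)   # Angell hopes that his theory is correct

# "Thought as a recursive concatenation of propositional attitudes" could then
# be sketched as an ordered sequence of such triples:
thought = [example,
           PropositionalAttitude("Angell", "fears", "the model is untestable")]
```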

          In a very rudimentary, primitive way, nature has genetically encoded for transmission species life-preserving information (attitudes, mental states or qualia) in the amygdala (a multimodal, audio-visual data base); this code may come to be represented by, correlated with, or eventually substituted by sentential particles or phonemes (Mom's baby talk?). In this way primitive past affects or qualic meanings become accessible to sensory inputs for comparison purposes and eventually get regenerated (as semantic particles, or semanticles) in the developing newborn and linked to an emerging, present-but-primitive generative grammar; both elements combine linearly (or in parallel) and can now be processed in a Turing biomachine (the talking brain?). The operations on these representations simultaneously generate the thought (a representation in a 'mental space'?) and the qualic disposition (the feeling of what happens?) pertinent to, and uniquely causally linked with, the content embodied in the syntax component.

          Before molecular or atomic sentential representations become possible, the baby can register all of the mother's facial and body expressions in a non-linear code and also develop the ability to integrate the various motor, humoral and affective components of an adaptive behavioral response. Eventually these atomic pre-sentential elements are substituted by their molecular sentential equivalents through the contributions of the generative grammar organ (ggo) and the regenerative semantic organ (rso), both combined as parts of a 'proto-linguistic organ' (plo).

          It is tempting to argue for the persistence of the rso atomic particles (phonemic semanticles?) as the elements to be sought and recognized in future explicit memory recalls. Persons suffering from loss of recent memory may still remember the sound profile of a forgotten word, e.g., recalling the 'ecology' sounds of the forgotten word 'gynecology'. The physical brain-code representation now gives the 'wetware' brain matter not only 'meaning' (semantical properties) but the ability to participate in triggering a conscious state, as we argued in the previous chapter on "Concatenation of Different Levels of Cognitive Processing". For the first time, physical state transitions in a physical brain system are able to preserve non-physical qualia through the linking of the sound representation with the limbic system via the amygdala! Might this explain how the 'non-physical feeling' may have causal influences on the physical brain (either through a syntax structure coding for the semantic content, or vice versa, as we argue)? We have defined this intermediary, interphasic state as the cognitive state. The importance of this aspect of our model is that it provides the decision-making thinker another opportunity to weigh the relative merits of either the 'affect' or the 'consequences' attending the possible results of the contemplated dispositive action before the option is executed, as long as it is remembered that only the execution of the motor event or solution may change, whereas the accompanying qualic justification remains. We feel we need not explain reductively how a specific inherited phonemic code triggers an invariant, neurochemically controlled, unexplained (unconscious) feeling; it just happens, like our reaction to the unfamiliar rustling of leaves.

          Many questions remain to be answered, especially about the linear coding, by proto-linguistic phonemic 'semanticles', of non-linear input, such as images or Kurzweil patterns, into meaningful content that carries the sentential equivalents of conditionals, disjunctives, existentials, etc., and other elements vital in judgments or decision making. We suspect that the genetic memory in the amygdala holds a unique, universal semantic representation (sounds, images, etc.) coding for an equally unique alphabet of atomic symbolic representations (the constituent particles of a molecular semantics), another research project in its own right! As long as we find no alternative to learning other than classical hypothesis formation and confirmation, we have to continue pressing in that direction; but we suspect, as we have argued for the role of the amygdala in the pre-linguistic newborn, that non-linguistic forms of communication exist, and we are trying to dig further into that hole!

          The undersigned had a personal experience recently that argues in favor of that suspicion. Two days before boarding the plane for Mensa's AG in Arizona, the undersigned suffered a ruptured appendix and spent 3 days on a ventilator and the remaining 7 in the intensive care unit of a Florida hospital. Since I was helpless to do anything, I shut my eyes and tried to disconnect from the nightmarish ongoing activity. At one point I tried to voice my concern, "Please, I need air to breathe," and to my surprise I could not hear myself vocalizing that properly articulated sentence even though I could hear other voices, much like the experience of 'inner speech'! Then I tried another sentence, "I can't hear my voice," with similar results. I was worried sick! Shortly afterwards I heard an assistant say: "You can't hear yourself because you have an endotracheal tube in place pressing against your vocal cords, but scribble on this pad anything you need." I was sedated and didn't pay much attention until I realized that she had answered a question she couldn't possibly have heard either!

 I also noticed that as long as I remained focused on my predicament I 'saw' shaded geometrical patterns in black and white while my eyes remained closed. When my attention wandered, the structured pattern was replaced by a reverie of zigzagging, chaotic patterns. Had the non-dominant brain taken over the communication role now denied to the 'talking brain'?

Argumentation.

          We now face the complex task of synthesizing, or integrating, the differential representations that the amygdaloidal and the hippocampal complexes individually make of objective reality: how do they link the phylogenetic and social past with the ongoing present through the instrumentality of language?

          Fortunately we are aided in the process by the ample literature on 'working memory' mechanisms, a successful integration of biological (b), psychic (p) and social (s) life preservation, and on 'implicit' and 'explicit' memory, the genetic and social past facing the contingencies of the evolving immediate present. Resting on the boulders of such excellent accounts of memory, generative grammar, hippocampal and amygdaloidal processing, cognitive science, AI, neuroscience and mathematical logic, we are able to identify a proto-linguistic organ (plo) that combines the 'implicit' (archilayers of Module 1) with the 'explicit' (paleolayers of Module 2) memory in the successful generation of a symbolic/sentential communication tool, binding the species in its dual responsibility to ensure biological survival and to reproductively perpetuate itself in the new ecological niche into which we did not choose to be born.

Summary and Conclusions.

          We have detailed in previous chapters and elsewhere ("Visceral Brain, Language and Thought", Noesis 2001) the short* and long** (see below) neural pathways mediating a subject's response when confronted with a new environmental situation. The short reflex pathway is very similar to the one shown by a newborn facing the new external environment. The stereotyped reflex response is brief, and its role is to alert and prepare the species for behavioral adaptive responses consonant with the preservation of life, from the startle reflex of the baby to the fear response of the adult. The audiovisual (or multimodal) stimulus perception triggers adaptive, affective, humoral, metabolic and motor responses, all coordinated by the various amygdaloidal lateral and central nuclei described above, all of which are organized to extract information from the external object or event and relate it to primitive biological survival parameters genetically coded as 'implicit' memory (see below for a brief synopsis). In the newborn the environmental object / event is substituted by the lactating mother, whose every facial expression gets scrutinized for meaning and demeanor, as explained above and elsewhere.

          Meanwhile, the longer neural pathway in the older baby or adult subject seeks additional information, as may have been previously recorded as 'explicit' memory in the hippocampal formation. This 'second look' may modify the original meaning attributed to the object / event and thus guide the subsequent response by modifying the original strategy initiated by the amygdala's response (reinforcement or withdrawal may actually be coordinated by the hippocampus based on explicit memory content), as detailed above. In the newborn this 'second look' is not yet available because of the paucity or absence of explicit memories in the hippocampal reservoir. This reservoir starts getting filled with non-linguistic audio-visual code from the mother's facial expressions, consolidated into more permanent memory form. Non-linear audio-visual code cannot be sequenced to extract meaning from operations on its representations by an undeveloped dominant 'talking brain'. Cooing and the baby's processing of the mother's baby talk represent the first primitive attempts to correlate, and eventually substitute, the non-linear audio-visual code with sound particles, progressing eventually to words and short sentences, the plo semanticles guiding Chomsky's generative grammar organ (ggo) into operation at the hippocampal / pre-frontal locus inside the plo!
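          The dual-route scheme summarized in the last two paragraphs can be caricatured in a few lines of code: a fast 'implicit' match fixes an immediate disposition, and a slower 'explicit', context-sensitive lookup may confirm or revise it. The dictionaries, keys and example stimulus below are invented placeholders, not claims about actual amygdaloid or hippocampal content.

```python
# A schematic, purely illustrative rendering of the short/long pathway scheme:
# a fast "implicit" match sets an immediate disposition, and a slower
# "explicit" contextual lookup may revise it.

IMPLICIT_ICONS = {"looming_shape": "fear", "caregiver_face": "approach"}      # inherited
EXPLICIT_MEMORY = {("looming_shape", "playground"): "safe - it is a swing"}   # learned

def short_pathway(stimulus):
    """Fast amygdaloid route: reflex disposition from the inherited gallery."""
    return IMPLICIT_ICONS.get(stimulus, "orient/attend")

def long_pathway(stimulus, context, first_response):
    """Slower hippocampal 'second look': explicit, contextual memory may
    override or confirm the reflex response."""
    revised = EXPLICIT_MEMORY.get((stimulus, context))
    return revised if revised is not None else first_response

reflex = short_pathway("looming_shape")                       # -> "fear"
final = long_pathway("looming_shape", "playground", reflex)   # -> "safe - it is a swing"
print(reflex, "->", final)
```

In the newborn, on this caricature, the EXPLICIT_MEMORY table is simply empty, so the reflex response stands unrevised.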

          We believe that at this stage the non-linear audio-visual code, with its primitive semantic content, gets linked with the appropriate primitive syntax particles, the latter being modified in the process, thus generating a primitive molecular 'propositional attitude'; we have not, at present, solved the details of this important linkage. We suspect that a mere glimpse at an audio-visual object / event triggers the recursive selection of the relevant molecular propositional-attitude package, which makes thought and language production simultaneous. We think the original amygdaloidal semanticles may participate in the binding integration of semantics, syntax and the associated qualia. Other components, e.g., pragmatics, will be added as proficiency in the language develops.

We have previously noted how baby-talk rehearsing by the baby coincides with his increasing awareness that he is not co-extensive with the other objects within his visual field, that there is a difference between himself and his surroundings: the emergence of a primitive self-consciousness! (See the chapter on Language and Thought above & Telicom 2000.)

We have demonstrated that proficiency in the use of language requires an online, ongoing capacity for self-consciousness (see the chapter on Subconscious Awareness in Language Processing above & Noesis 2000). It is tempting to conclude that self-consciousness, an exclusively human trait we cannot even reduce to the physical domain, is intimately tied up with another exclusively human trait, our type of language structure. A robot with a symbolic language that captures all the salient and essential aspects of human life is a pipe-dream ideal for AI engineers, who hope it will pass the Turing test and have qualia, but dreams are just that…, dreams. The irreducible aspects of consciousness will not go away. Might linguistics provide some clues?

*Short neural route. The retinal (visual) or cochlear (auditory) receptor organs connect with the medial and lateral geniculate thalamic nuclei. Many (possibly all, in the newborn) of the axons that terminate in the lateral geniculate nucleus are branches of axons that go to the superior colliculus, a midbrain center concerned with the control of oculomotor head and eye displacements. Those terminating in the medial geniculate body project to the inferior colliculus, where they mediate audiomotor head and eye movements. It has been shown that the inferior colliculus is capable of tonotopic sound discrimination. Other fibers project to the amygdaloid nuclei.

**Long neural route. The retinal (visual) or cochlear (auditory) receptor organs connect with the medial and lateral geniculate thalamic nuclei [24, 25, 26] (auditory and visual respectively), which lie further caudally, near the pulvinar, and send their efferents to the auditory and visual cortical areas respectively (in humans, area 17 or V1 on either side of the calcarine fissure, and the auditory cortex in the superior temporal Heschl gyrus). As we have explained elsewhere, Heschl's area projects both to the angular gyrus and to the entorhinal / hippocampal formation before accessing Broca's area.

      At the collicular level, important afferent connections have been described. Those areas of the cortex that originate connections to higher-order nuclei of the thalamus themselves receive afferents from other sources in the thalamus, so that in this sense the higher-order circuits represent a 'second run' of re-entrant circuitry through the thalamus, part of a trans-thalamic, cortico-cortical integrating circuit as described by Crick and others. Further, where the relationships have been studied, the corticothalamic axons that serve as drivers and innervate higher-order thalamic nuclei are seen to be branches of long descending motor axons that go to the brain stem or spinal cord. That is, most of these driver axons innervating thalamic nuclei, both first- and higher-order, are branches of axons carrying motor instructions to lower centers. Practically all fibers crossing in either direction between thalamus and cortex leave collaterals to the reticular nuclei. These are glutamatergic, excitatory afferents from branches of the thalamocortical and corticothalamic axons and, in turn, the reticular nuclei send inhibitory, GABAergic axons back to the thalamus [43]. The reticular nuclei also receive afferent innervation from the brain stem. The 'reticular nucleus' denomination is deceptive because it is highly differentiated into functional modules (vision, hearing, somatosensory) or intra-thalamic sectors (anterior, VA/VL, MD), and within all of these sectors there are topographic maps of cortical surfaces, thalamic maps and/or peripheral sensory maps. In awake, behaving animals, recordings from thalamic cells show more irregular bursting in relay cells when the experimental animal is not attending to the modality, or to the region of the sensory field, under study. When attention is being paid to the region under study, the cell is in the tonic mode and a linear version of the input is passed on to the cortex for analysis. In the awake but inattentive condition, the relay cell, in a non-rhythmic burst mode, transmits but a poor copy of the input. This arrangement accounts very well for my ICU experience narrated above.
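      A toy sketch of that last relay distinction: in the attentive (tonic) mode the relay passes an essentially linear copy of its input to the cortex, while in the inattentive (burst) mode it passes only a sparse, all-or-none caricature of it. The gain, burst probability and example signal below are invented illustrative values, not physiological parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

def thalamic_relay(signal, attending):
    """Toy version of the relay behaviour described above: tonic mode passes
    an essentially linear copy of the input; burst mode passes a thresholded,
    intermittent 'poor copy'. Parameters are invented."""
    if attending:                                   # tonic mode: faithful, linear relay
        return 0.9 * signal
    bursts = rng.random(signal.shape) < 0.3         # sparse, irregular bursting
    return np.where(bursts, np.sign(signal), 0.0)   # all-or-none, detail lost

t = np.linspace(0, 1, 8)
stimulus = np.sin(2 * np.pi * t)
print(thalamic_relay(stimulus, attending=True))     # close copy of the input
print(thalamic_relay(stimulus, attending=False))    # degraded, bursty copy
```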

End Chapter 12