A BIBLIOGRAPHICAL GUIDE TO THE MIND

Piero Scaruffi

Draft of April 1997

Preface

     This document provides a guided bibliography to  literature  published
     (mainly,  but  not exclusively, over the last two decades) on the sub-
     ject of the mind. The exponentially growing interest in the mind,
     consciousness and life is altering the course of many disciplines and
     opening fascinating new horizons for science. Subjects created by this
     trend include Artificial Intelligence, Artificial Life, Neural Net-
     works, Cognitive Science and Complex Systems. Physics itself is being
     rewritten, in the quest for a grand theory of nature that will unify
     the physical and the psychological sciences.

     Books  reviewed  in  this  bibliography  therefore  span   Philosophy,
     Psychology,  Biology,  Computer  Science, Neurophysiology, Mathematics
     and even Cosmology.

     This document contains an  alphabetical list of  books  with  a  short
     review of their contents.  The reader can use it to decide which books
     to buy for further information, or just "cut and paste" the information
     for her or his personal research.

     In a sense, this document provides the researcher, or the merely curi-
     ous,  with  the "tools" to work out her or his own theory of the mind.
     At the same time it provides everybody with an updated survey of one of the
     most exciting fields of today's science.

     Proceedings of conferences have generally been omitted, but collections
     of historical articles are included. Books that have been made
     obsolete by new editions or by new books by the same author  are  gen-
     erally omitted.

     The decision about which books to include was largely subjective.
     Recent  books have been given a higher priority, both because of their
     availability and because they are likely to include information  about
     older texts.

     In reviewing a book I have often quoted  liberally  from  the  author.
     Each  review is meant to deliver the main original points of the book.
     It is not meant to be an exhaustive review of the entire content.

     The author will gladly receive information  about  books  that  should
     have been included and were not. A future edition will hopefully do
     justice to those that were overlooked this time.

     Nobody has time anymore to read all the  interesting  books  that  are
     written  in  the world. This is a humble effort to make it possible to
     be at least aware of their existence.

      Piero Scaruffi
      scaruffi@hpp.stanford.edu

Introduction

     These days something is happening that is likely to have  deep  reper-
     cussions  on  the future of science. A new view of nature is emerging,
     which encompasses both galaxies and  neurons,  gravitation  and  life,
     molecules  and  emotions.  As  a  culmination of centuries of studying
     nature, mankind has been approaching the  thorniest  subject  of  all:
     ourselves.  We  are part of nature, but science leaves us in the back-
     ground, limiting our role to that of observers.

     For a long time we have enjoyed this privileged status. But we seem no
     longer capable of eluding the fundamental issue: that what we have
     been studying for all these centuries is ourselves, albeit disguised under
     theories of the universe and theories of elementary particles. And now
     it is about time that we focus on the real subject. The  mind  appears
     to  us  as  the  ultimate  and  most refined product of life. And life
     appears to us  as  the  ultimate  and  most  refined  product  of  the
     universe.  Life  and  mind must follow from a logical inference on the
     essence of the universe. If we had the right theory of the universe,
     we would have no trouble explaining why life happened and what the
     mind is.

     The fact that we don't yet have a good theory of the mind means that
     we probably don't have a good theory of the universe. Therefore, in a
     sense, the science of the mind is doing more than  just  studying  the
     mind: it is indirectly reformulating the whole of science.

     Thanks to progress in all fields, from  Mathematics  to  Neurobiology,
     our  knowledge  has  been  immensely enriched by a wealth of empirical
     data and by a wealth of theoretical  tools.  While  differing  on  the
     specifics,  many  scientists and philosophers feel that mankind is now
     ready for a momentous synthesis.  The main theme of such  a  synthesis
     may  be  that  of  the spontaneous "emergence" in our universe of such
     unlikely properties as life and consciousness. If we can explain how
     they developed, we can explain what they are and how they work.

Ageno Mario: LE ORIGINI DELL'UNIVERSO (Boringhieri, 1992)

          Mario Ageno shows that Boltzmann's proof of irreversibility
          contains two errors: 1. Boltzmann's model of a gas represents a
          discrete set of molecules as a continuum of points; 2. Boltzmann
          assumes that the walls
          containing  the  closed system are perfectly reflecting. If these
          arbitrary assumptions are dropped, no  rigorous  proof   for  the
          irreversibility of natural processes exists.

Aggleton John: THE AMYGDALA (Wiley-Liss, 1992)

          The book explores various neurobiological aspects of emotion  and
          memory.  Emotions are key to learning and behavior as fear condi-
          tioning imprints emotional memories that are quite permanent. The
          relationship  between  emotion  and  memory goes beyond fear, but
          fear is the emotion that has been studied more extensively. As  a
          matter  of  fact, fear seems to be a common ground for (at least)
          all vertebrates. Memories about fearful experiences  are  created
          by  interactions among the amygdala, the thalamus and the cortex.
          Emotional memory (stored in the amygdala) differs  from  declara-
          tive  memory  (which  is mediated by the hippocampus and the cor-
          tex). Emotional memory is primitive, in the sense that it only
          contains simple links between cues and responses. A noise in the
          middle of the night is enough to create a state of anxiety,
          without necessarily bringing to full consciousness what the
          origin of that noise might be. This actually increases
          the  efficiency  (at  least the speed) of the emotional response.
          Emotional and declarative memories are stored  and  retrieved  in
          parallel.   Adults  cannot  recall  childhood  traumas because in
          children the hippocampus has not yet  matured  to  the  point  of
          forming conscious memories, but the emotional memory is there.

Allen James: NATURAL LANGUAGE UNDERSTANDING (Benjamin Cummings, 1995)

          The new edition of one of the best textbooks on natural  language
          processing, from basic parsing techniques to anaphora resolution,
          discourse structure and speech acts.

Allen James: READINGS IN PLANNING (Morgan Kaufmann, 1990)

          Allen's temporal logic is based on a many-sorted  predicate  cal-
          culus with variables ranging over "properties", "time intervals",
          "events", etc.  Temproal relations such  as  "during",  "before",
          "overlap",  "meets" and "equal" are primitive, are represented by
          predicates and are controlled by the  axioms  of  the  logic.  An
          instant  is defined as a very small interval.  Properties hold at
          intervals.
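
          By way of illustration, here is a minimal Python sketch (mine, not
          Allen's formalism, which treats the relations as primitives governed
          by axioms) of a few of these interval relations, with intervals
          modeled as pairs of numeric endpoints:

            from dataclasses import dataclass

            @dataclass
            class Interval:
                start: float
                end: float

            def before(a, b):    # a ends strictly before b starts
                return a.end < b.start

            def meets(a, b):     # a ends exactly where b starts
                return a.end == b.start

            def during(a, b):    # a lies strictly inside b
                return b.start < a.start and a.end < b.end

            def overlaps(a, b):  # a starts first, b starts inside a
                return a.start < b.start < a.end < b.end

            breakfast = Interval(8.0, 8.5)
            morning = Interval(6.0, 12.0)
            print(during(breakfast, morning))   # True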

Amari Shun-ichi & Freeman Walter: NEURAL NETWORKS AND CHAOS (Lawrence Erlbaum, 1994)

          A collection of papers for a workshop on the subject.

Anderson James A. & Rosenfeld Edward: NEURO-COMPUTING (MIT Press, 1988)

          A comprehensive collection of historical papers on brain anatomy,
          cognitive psychology, cybernetics and neural networks.

          William James had a number of powerful intuitions: that the brain
          is  built  to  ensure survival in the world; that cognitive func-
          tions cannot be abstracted from the environment  that  they  deal
          with; that the brain is organized as an associative network; that
          associations are governed by a rule of reinforcement.

          Warren McCulloch's and Walter Pitts' 1943 "A logical calculus of
          the ideas immanent in nervous activity" is a seminal paper that
          laid down the foundations for the  computational  theory  of  the
          brain.  Their  binary  neuron  can only be in one of two possible
          states, has a fixed threshold below which  it  never  fires,  can
          receive inputs from both inhibitory synapses and excitatory
          synapses, and integrates its input signals at discrete  intervals
          of  time.   If no inhibitory synapse is active and the sum of all
          excitatory synapses is greater than  the  threshold,  the  neuron
          fires. A network of binary neurons is fully equivalent to a
          universal Turing machine (i.e., any finite logical proposition
          can be realized by such a network, and every computer program
          can be implemented as a neural net).
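
          For illustration, a minimal Python sketch (my own, following the
          rule just described, with invented variable names) of such a
          binary unit:

            # Fires only if no inhibitory input is active and the sum of
            # the excitatory inputs exceeds the fixed threshold.
            def mcculloch_pitts(excitatory, inhibitory, threshold):
                if any(inhibitory):          # absolute inhibition
                    return 0
                return 1 if sum(excitatory) > threshold else 0

            # An AND gate with threshold 1: fires only when both inputs are on.
            print(mcculloch_pitts([1, 1], [], threshold=1))   # 1
            print(mcculloch_pitts([1, 0], [], threshold=1))   # 0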

          Featured are the main forefathers  of  today's  neural  architec-
          tures. Oliver Selfridge's 1958 "Pandemonium" employs a battery
          of independent units to analyze the input, each specialized
          in a different recognition task, so that the input can be pro-
          gressively identified through a number  of  hierarchical  layers,
          each one relying on the conclusions of the lower ones.

          Rosenblatt's 1958 "Perceptron", based on a  non-linear  model  of
          memory,  was  probably  the  first  artificial neural network for
          learning concepts.

          Bernard Widrow's and Marcian Hoff's 1960 "Adaptive switching cir-
          cuits" yielded the ADALINE, a variation on the perceptron based on
          a supervised learning rule, the "error correction rule", that
          could learn in a faster and more accurate way: synaptic strengths
          are changed in proportion to the error (the difference between what
          the output is and what it should have been) times the input.
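
          A minimal Python sketch (mine; the learning rate and data are
          invented) of this error-correction rule, in which each weight moves
          in proportion to the error times the corresponding input:

            def adaline_step(weights, x, target, lr=0.1):
                output = sum(w * xi for w, xi in zip(weights, x))
                error = target - output                 # desired minus actual
                return [w + lr * error * xi for w, xi in zip(weights, x)]

            weights = [0.0, 0.0]
            for _ in range(50):          # learn a weighted sum on a toy sample
                weights = adaline_step(weights, [1.0, 2.0], target=3.0)
            print(weights)               # the weighted sum of [1, 2] approaches 3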

          Briefly mentioned are  also  Teuvo  Kohonen's  linear  model  for
          memory  and  Stephen Grossberg's non-linear quantitative descrip-
          tions of brain processes.

          John Hopfield's  1982  "Neural  networks  and  physical  systems"
          developed  a  model  inspired by the "spin glass" material, which
          resembles a one-layer neural network in which weights are distri-
          buted in a symmetrical fashion, the learning rule is Hebbian,
          neurons are binary and each neuron is connected  to  every  other
          neuron.  As  they  learn,  Hopfield's nets develop configurations
          that are dynamically stable (or "ultrastable"). Their dynamics is
          dominated  by  a  tendency  towards a very high number of locally
          stable  states  (or  "attractors").   Every  memory  is  a  local
          "minimum" for an energy function similar to potential energy.

          Hopfield's nets exhibit the  ability  to  correct  incomplete  or
          incorrect  information  (because deviations from local minima are
          attracted towards one of  those  minima  and  therefore  canceled
          away).  Compared with the perceptron, a Hopfield net is asynchro-
          nous (which is a more plausible model of the nervous system)  and
          employs backward coupling.  In a later paper (also included here)
          Hopfield replaced the binary neuron with a more plausible neuron.

          More than anything else, Hopfield proved that,  despite  Minsky's
          critique, neural networks are feasible and can even be useful.
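
          As an illustration of the scheme described above (symmetric Hebbian
          weights, binary units, asynchronous updates that settle into an
          attractor), here is a toy Python/NumPy sketch of my own, not taken
          from the paper:

            import numpy as np

            def train(patterns):
                n = len(patterns[0])
                W = np.zeros((n, n))
                for p in patterns:
                    p = np.array(p, dtype=float)
                    W += np.outer(p, p)        # Hebbian outer product
                np.fill_diagonal(W, 0)         # no self-connections
                return W

            def recall(W, state, steps=100):
                s = np.array(state, dtype=float)
                for _ in range(steps):
                    i = np.random.randint(len(s))      # asynchronous update
                    s[i] = 1.0 if W[i] @ s >= 0 else -1.0
                return s

            W = train([[1, -1, 1, -1, 1, -1]])
            print(recall(W, [1, -1, 1, -1, -1, -1]))   # corrupted bit is repaired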

          Fukushima's 1983 "Neocognitron" is a multi-layered  network  with
          strong self-organizing properties, based on Hubel and Wiesel's
          model of the visual system. A number of modules are triggered by
          a retina of photoreceptors. Each module has a number of simple
          "S-cells" and more complex "C-cells"; the "C-cells" are driven by
          the "S-cell" layers and abstract the features that the "S-cells"
          pick up.

          In Geoffrey Hinton's and Terrence Sejnowski's 1985 "A learning
          algorithm  for  Boltzmann machines" Hopfield's basic architecture
          (binary neuron, energy function  and  so  on)  is  retained,  but
          Hopfield's  learning  rule is replaced with the rule of annealing
          (start off the system at very high "temperature" and then  gradu-
          ally  drop the temperature to zero), which Kirkpatrick and others
          had just proposed as a general-purpose optimization rule. The
          new model, the Boltzmann machine, is more stable than Hopfield's
          model as it will always end in a global minimum (the lowest
          energy state).
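
          A tiny Python sketch (mine, not the Boltzmann machine itself) of
          the annealing idea: a stochastic unit is updated under a
          "temperature" that is gradually lowered, so early updates explore
          freely and late updates settle into the low-energy state:

            import math, random

            def anneal(energy_gap, t_start=10.0, t_end=0.01, steps=100):
                state = random.choice([0, 1])
                for k in range(steps):
                    # geometric cooling schedule from t_start down to t_end
                    t = t_start * (t_end / t_start) ** (k / (steps - 1))
                    p_on = 1.0 / (1.0 + math.exp(-energy_gap / t))
                    state = 1 if random.random() < p_on else 0
                return state

            # With a positive energy gap the unit almost always ends up on.
            print(anneal(energy_gap=2.0))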

          David Rumelhart's and Geoffrey Hinton's "back-propagation"  algo-
          rithm,  originally proposed in 1986, considerably faster than the
          Boltzmann machine, quickly became the most popular learning  rule
          for multi-layered networks. The generalized "delta rule" was
          basically an adaptation of the Widrow-Hoff error correction rule
          to the case of multi-layered networks by moving backwards from
          the output layer to the input layer. This was also the definitive
          answer to Minsky's critique, as multi-layered networks proved able
          to solve the problems that single-layer perceptrons could not.
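
          A compact Python/NumPy sketch (my own; the network size, learning
          rate and the XOR data are arbitrary choices) of the generalized
          delta rule: the output error is propagated backwards through one
          hidden layer of sigmoid units:

            import numpy as np

            def sigmoid(z):
                return 1.0 / (1.0 + np.exp(-z))

            X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
            y = np.array([[0], [1], [1], [0]], dtype=float)    # XOR targets

            rng = np.random.default_rng(0)
            W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
            W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

            for _ in range(10000):
                h = sigmoid(X @ W1 + b1)                # forward pass
                out = sigmoid(h @ W2 + b2)
                d_out = (out - y) * out * (1 - out)     # delta at the output layer
                d_h = (d_out @ W2.T) * h * (1 - h)      # delta moved backwards
                W2 -= 0.5 * h.T @ d_out
                b2 -= 0.5 * d_out.sum(axis=0)
                W1 -= 0.5 * X.T @ d_h
                b1 -= 0.5 * d_h.sum(axis=0)

            print(out.round(2))      # approaches the XOR targets [0, 1, 1, 0]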

Anderson James A.: NEURO-COMPUTING 2 (MIT Press, 1990)

          Another set of historical articles, including seminal  papers  on
          Caianiello's  neural  equations,  Wiener's cybernetics, Pribram's
          holographic model, Minsky's critique of perceptrons and Fodor's
          and Pylyshyn's "Connectionism and cognitive architecture" on the
          feasibility of a compositional theory.

Anderson John Robert: THE ARCHITECTURE OF COGNITION (Harvard Univ Press, 1983)

          ACT, as developed in 1976, was a cognitive  architecture  capable
          of dealing with both declarative knowledge (represented by propo-
          sitional networks) and procedural knowledge (represented by  pro-
          duction  rules).  The production system worked as the interpreter
          of the propositional network.

          New production rules are learned as the  system  works.   Complex
          cognitive skills can develop from a simple architecture.

          ACT assumes that a cognitive system has two short-term memories:
          a declarative memory (that remembers experience) and a production
          memory (that remembers rules learned from experience). An
          incremental process transforming declarative knowledge into
          procedural knowledge consolidates knowledge into ever more
          complex procedural chunks. Each rule is weighted according to how
          often it is used, and the weight determines its priority.

Anderson John Robert: THE ADAPTIVE CHARACTER OF THOUGHT (Lawrence Erlbaum, 1990)

          The book explores the cognitive architecture known as ACT,  which
          broadens the principles of production systems.

          Anderson has developed a  probabilistic  method  to  explain  how
          categories  are  built  and how prototypes are chosen. Anderson's
          model maximizes the inferential potential  of  categories  (i.e.,
          their  "usefulness"):  the  more  a  category  helps  predict the
          features of an object, the more the existence  of  that  category
          makes  sense.  For each new object, Anderson's model computes the
          probability  that  the  object  belongs  to  one  of  the   known
          categories and the probability that it belongs to a new category:
          if the latter is greater than  the  former,  a  new  category  is
          created.
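
          A much-simplified Python sketch (my own stand-in, not Anderson's
          Bayesian formulation) of this decision rule: an object joins the
          existing category that best predicts its features, unless a
          brand-new category scores higher:

            def assign(obj, categories, new_category_prior=0.3):
                def score(members):        # average feature overlap
                    shared = [len(obj & m) / len(obj | m) for m in members]
                    return sum(shared) / len(shared)
                best = max(categories, key=lambda c: score(categories[c]),
                           default=None)
                if best is None or score(categories[best]) < new_category_prior:
                    best = "category-%d" % (len(categories) + 1)
                    categories[best] = []
                categories[best].append(obj)
                return best

            cats = {}
            print(assign({"wings", "feathers"}, cats))           # category-1
            print(assign({"wings", "feathers", "beak"}, cats))   # joins category-1
            print(assign({"wheels", "engine"}, cats))            # category-2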

Anderson John Robert: RULES OF THE MIND (Lawrence Erlbaum, 1993)

          In this book Anderson looks for  the  psychological  evidence  of
          production  systems  (in particular in the area of acquisition of
          cognitive skills) and refines ACT into ACT-R,  which  includes  a
          neural-network implementation of a production system. The book is
          structured as a set of articles by Anderson and  others,  and  it
          includes simulation software.

Anderson James: AN INTRODUCTION TO NEURAL NETWORKS (MIT Press, 1995)

          A very up-to-date 600-page survey of the mathematical foundations
          of neural networks that neatly organizes linear associators, per-
          ceptrons, gradient descent  algorithms  (ADALINE,  back  propaga-
          tion),  nearest  neighbor  models,  Kanerva's  sparse distributed
          memories,  energy-based   models   (Hopfield   model,   Boltzmann
          machine),  Kohonen's adaptive maps, the BSB model, etc. The sonar
          system of the bat is also reviewed.

Anderson Norman: A FUNCTIONAL THEORY OF COGNITION (Lawrence Erlbaum, 1996)

          Anderson presents a unified theory of functional cognition,  i.e.
          a  cognitive theory of everyday life, centered around the founda-
          tion axiom of purposiveness.

Aoun Joseph: A GRAMMAR OF ANAPHORA (MIT Press, 1986)

          Aoun deals with reciprocals and reflexives by  proposing  a  gen-
          eralized  government-binding  theory  that  leads to a structural
          unification of the notions of pronouns, empty categories and ana-
          phors.

Arbib Michael: THE HANDBOOK OF BRAIN THEORY AND NEURAL NETWORKS (MIT Press, 1995)

          This 1,000-page handbook (compiled by dozens of experts under the
          direction of Michael Arbib) covers topics in Psychology, Philoso-
          phy, Neurophysiology,  Artificial  Intelligence,  self-organizing
          systems, neural networks, etc.

Arbib Michael: METAPHORICAL BRAIN (Wiley, 1972)

          This introduction to cybernetics begins by distinguishing the
          simulation and emulation approaches to modeling intelligent
          behavior, i.e.
          artificial  intelligence  and  neural  networks.  Then  the  book
          focuses  on  brain  theory, considering the brain as a particular
          type of machine.

Arbib Michael: CONSTRUCTION OF REALITY (Cambridge University Press, 1986)

          The mind constructs reality  through  a  network  of  schemas.  A
          schema is both a mental representation of the world and a process
          that determines action in the world. Arbib's theory of schemas
          is based on Peirce's notion of a "habit" (a set of operational
          rules that, by exhibiting both stability and adaptability, lends
          itself to an evolutionary process) and Piaget's notion of a
          "scheme" (the generalizable characteristics  of  an  action  that
          allow the application of the same action to a different context).
          Both assume that schemas are compounded  as  they  are  built  to
          yield successive levels of a cognitive hierarchy.  Categories are
          not innate, they are constructed through the individual's experi-
          ence.  What is innate is the process that underlies the construc-
          tion of categories (this is similar  to  Chomsky's  view  of  the
          rules of language).

          The theory of schemas is consistent with a model of the brain  as
          an evolving self-configuring system of interconnected units.

          The construction of reality is also guided by social conventions,
          as  the  formation  of new schemas is sometimes a social process.
          Language arises from such a process. Arbib argues that all
          language is metaphorical and bases his theory of language on
          Black's interaction theory of metaphor: to understand the meaning
          of  a  sentence  is not only to be able to identify its referent,
          but also to call to mind all the schemas associated to it.  Meta-
          phor is a necessary ingredient of any symbolic system.

          The theory is applied to a wealth of  issues  in  psychoanalysis,
          hermeneutics, epistemology and even theology.

Arbib Michael: FROM SCHEMA THEORY TO LANGUAGE (Oxford Univ Press, 1987)

          A theory of language based on Arbib's theory of schemas,  with  a
          practical implementation.

Arbib Michael: BRAINS, MACHINES AND MATHEMATICS (Springer Verlag, 1987)

          An introduction to some topics of cybernetics,  neural  networks,
          Turing machines, self-reproducing automata, and Godel's incom-
          pleteness theorem.

Arbib Michael: METAPHORICAL BRAIN 2 (Wiley, 1989)

          The second volume greatly  expands  the  contents  of  the  first
          volume.  Besides a little neuroanatomy, the focus is on mathemat-
          ical  analyses  of  neural  phenomena  from  the  perspective  of
          action-oriented perception and in the light of Arbib's own theory
          of schemas. Schema theory is applied to the vision  of  the  frog
          and  high-level  recognition, hand control and speech understand-
          ing. Along the way, mathematical models are  offered  to  explain
          locomotion  and  eye  movement;  and all the main learning models
          (from perceptrons to the HEARSAY system, from  Hopfield  nets  to
          Boltzmann  machines,  from backpropagation to the NETTALK system)
          are formally introduced.

          Arbib  advances  a  theory  of  consciousness:   first   language
          developed,  as  a  tool  to communicate with other members of the
          group in order to coordinate group  action;  then   communication
          evolved  beyond the individual-to-individual sphere into the self
          sphere.

Armstrong David Malet: BELIEF, TRUTH AND KNOWLEDGE (Cambridge University Press, 1973)

          Beliefs are maps of the  world  (with  the  believer  as  central
          reference)  by  which the believer's actions are guided.  Beliefs
          are states that have an internal structure: the  content  of  the
          proposition  believed.  Beliefs may be reduced to the deep struc-
          tures of Chomsky's linguistic  theory.   Beliefs  often  come  in
          degrees:  a  partial belief is a degree of causal efficacy of the
          belief state in relation to action.

Armstrong David Malet: THE NATURE OF MIND (Cornell Univ Press, 1981)

          A philosophical treatise on the dualism of the mind, which also
          presents Armstrong's causal theory of the mind. Mental states
          and physical states are identical (just  like  we  perceive  many
          natural  phenomena  without  perceiving the corresponding micros-
          copic physical processes) and a mental  state  is  causally  con-
          nected with a physical state.  A state of the brain causes a men-
          tal state.  Consciousness of a mental state is  a  perception  of
          that mental state.

          Consciousness is the perception of  mental  states.  Its  special
          status  is  purely  illusory.  The  self is the single continuing
          entity that appears from the organization of  introspection.  The
          biological function of consciousness is to refine mental
          processes so that they yield more interesting action.

Ashby William: AN INTRODUCTION TO CYBERNETICS (Chapman & Hall, 1956)

          In this book Ashby summarized a number of  influential  concepts.
          He  placed  emphasis  on  feedback,  the  process that allows for
          "homeostasis".  Both machines and living beings tend to change to
          compensate for variations in the environment, so that the combined
          system is stable. For living beings this translates into "adapta-
          tion"  to  the  environment.   The  "functioning"  of both living
          beings and machines depends on feedback  processes.   Ashby  also
          emphasized  the power of self-organizing systems, systems made of
          a very high number of simple units which can evolve  autonomously
          and adapt to the environment by virtue of their structure.

          In 1962 Ashby also formulated his principle of self-organization:
          "in   any   isolated  system  life  and  intelligence  inevitably
          develop". In every isolated system  subject  to  constant  forces
          "organisms"  arise that are capable of adapting to their environ-
          ment.

Ashmore Richard & Jussim Lee: SELF AND IDENTITY (Oxford Univ Press, 1997)

          A series of articles on the relationship between the self and the
          idea of identity.

Ashtekar Abhay: CONCEPTUAL PROBLEMS OF QUANTUM GRAVITY (Birkhauser, 1991)

          Ashtekar is a proponent of the loop-space theory of quantum grav-
          ity.  To quantize gravity physicists only need to show that grav-
          itational waves consist of quantum force-carrying  particles,  or
          gravitons.   The perturbation methods that have been developed to
          this purpose (and which gave rise to the theory of  superstrings,
          infinitesimal  loops  of  energy whose wrigglings should generate
          particles and forces)  have  largely  failed  because  gravitons,
          unlike other force carriers, alter the very geometry of space and
          time, which in turn  affects  their  behavior;  in  other  words,
          because  of  gravity's  inherently  self-referential,  non-linear
          nature.

          By using Amitabha Sen's variable, time and space can be split in
          two distinct entities subject to quantum uncertainty just like
          position and momentum. Ashtekar's equations generate exact solu-
          tions for quantum gravitational states that can be represented by
          loops  (as  in  knot  theory).  The  loops  are  tightly  knitted
          together. Gravitons are embroidery knitted into the loops.

Austin John Langshaw: HOW TO DO THINGS WITH WORDS (Oxford Univ Press, 1962)

          Austin treats language as a particular case of action, the "speech
          act".

          Austin introduced a tripartite classification of  acts  performed
          when  a  person  speaks.   Each utterance entails three different
          categories of speech acts: a locutionary act (the words  employed
          to  deliver  the  utterance),  an  illocutionary act (the type of
          action that it performs, such as warning, commanding,  promising,
          asking), and a perlocutionary act (the effect that the act has on
          the listener, such as believing or answering).

          A locutionary act is the act of producing a meaningful linguistic
          sentence.  An illocutionary act sheds light on why the speaker is
          uttering that meaningful linguistic  sentence.  A  perlocutionary
          act is performed only if the speaker's strategy succeeds.

          Austin believes that  any  locutionary  act  (phonetic  act  plus
          phatic  act plus rhetic act) is part of a discourse which bestows
          an illocutionary force on it. All language is therefore an  illo-
          cutionary act.

Austin John Langshaw: SENSE AND SENSIBILIA (Clarendon, 1962)

          We cannot directly perceive material objects, but only sense-
          data.

Austin John Langshaw: PHILOSOPHICAL PAPERS (Clarendon, 1961)

          A collection of all the philosophical papers of  the  philosopher
          famous  for  his theory of truth as grounded in historical situa-
          tions: "a statement is true when the historic state of affairs to
          which it is correlated by the "demonstrative" conventions is of a
          type with which the sentence used in making it is correlated by
          the "descriptive" conventions". Descriptive conventions correlate
          sentences with  types  of  situation.  Demonstrative  conventions
          correlate statements with historic situations.

Baars Bernard: A COGNITIVE THEORY OF CONSCIOUSNESS (Cambridge Univ Press, 1993)

          Conscious experience is distributed widely throughout the nervous
          system.   Any  conscious  experience emerges from cooperation and
          competition between the many processing units of the brain  work-
          ing  in  parallel  and  occurs within a background of unconscious
          contexts. The self is the  dominant,  enduring  context  of  many
          conscious experiences. Baars brings forth a wealth of psychologi-
          cal and neurophysiological data to justify his views.   The  ner-
          vous  system  can  be  viewed as a set of independent intelligent
          agents which broadcast messages to the  other  agents  through  a
          common  workspace  (just  as if they were writing on a blackboard
          visible to every other agent).  That workspace is consciousness.
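
          An illustrative Python toy (the processor names and the bidding
          scheme are my own inventions, not Baars') of this blackboard
          picture: specialist processors bid for access to a shared
          workspace, and the winner's message is broadcast to all of them:

            class Processor:
                def __init__(self, name, keyword):
                    self.name, self.keyword, self.heard = name, keyword, []
                def bid(self, stimulus):
                    return 1.0 if self.keyword in stimulus else 0.0
                def receive(self, message):
                    self.heard.append(message)

            class Workspace:
                def __init__(self, processors):
                    self.processors = processors
                def cycle(self, stimulus):
                    winner = max(self.processors, key=lambda p: p.bid(stimulus))
                    message = "%s: %s" % (winner.name, stimulus)
                    for p in self.processors:       # global broadcast
                        p.receive(message)
                    return message

            ws = Workspace([Processor("vision", "light"),
                            Processor("hearing", "noise")])
            print(ws.cycle("a sudden noise"))   # hearing wins and is broadcast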

          He starts by arguing that the mind contains unconscious mental
          representations,   such   as  episodic  memories  and  linguistic
          knowledge; that  the  mind  originates  from  the  work  of  many
          independent,  specialized  "processors",  i.e.  skills  that have
          become  highly  practiced,  automatic  and   unconscious.   Baars
          emphasizes the striking differences between conscious and uncons-
          cious processes: unconscious processes are  much  more  effective
          (e.g.,  we parse sentences unconsciously all the time, but cannot
          consciously define how we parse them), they operate  in  parallel
          (whereas we can only have one conscious process at a time),
          they  appear  to  have  almost  unlimited   capacity   (conscious
          processes have very limited capacity).

          Contexts are created by a dual process of searching for  informa-
          tion  and  adaptation  to information, the former leading to more
          conscious access, the latter reducing  conscious  access  (things
          become  habitual  and automatic).  Baars emphasizes the relation-
          ship between information and  consciousness  (perceptual  systems
          are more sensitive to information than energy, redundant informa-
          tion fades from consciousness).  Conscious experience is informa-
          tive  and  triggers  widespread  adaptive  processes.   Conscious
          experience is the product of biological adaptation.

Baars Bernard: IN THE THEATRE OF CONSCIOUSNESS (Oxford Univ Press, 1996)

          Baars conceives consciousness as a theatrical stage for  emotion,
          perception and thoughts.

Back Thomas, Fogel David & Michalewicz Zbigniew: HANDBOOK OF EVOLUTIONARY COMPUTATION (Oxford Univ Press, 1997)

          The ultimate handbook for professional genetic algorithm users.

Bach Emmon: UNIVERSALS IN LINGUISTIC THEORY (Holt, Rinehart & Winston, 1968)

          A collection of four essays on linguistics, the longest one being
          Charles Fillmore's seminal "The case for case".

          Fillmore's grammar assumes that each sentence  represents  expli-
          citly the relationships between concepts and action. A universal
          underlying set of caselike relations plays a key role in determin-
          ing  syntactic  and  semantic relations in all languages.  A sen-
          tence is represented by identifying  its  "cases",  analogous  to
          noun  cases.  Sentences  that  deliver the same meaning with dif-
          ferent words are therefore represented in the same way.

Bach Emmon: CATEGORIAL GRAMMARS (Reidel, 1988)

          A collection of essays on what  Yehoshua  Bar-Hillel  defined  in
          1960  as categorial grammar, that provide an excellent historical
          introduction to the field.

          In contrast to linguistic  analyses  based  on  phrase  structure
          grammars,  in  a  categorial  grammar  every  item  of a language
          belongs to one or more categories; a category can be either basic
          or  derived;  derived categories are defined in terms of basic or
          derived categories in a compositional way.  Expressions belonging
          to derived categories may be identified with functions that map
          expressions of  one  constituent  category  into  expressions  of
          another constituent category.

          Categorial grammars adhere to three principles: language is  seen
          in  terms  of  functions  and  arguments  rather than constituent
          structure (dependency grammar rather than phrase-structure  gram-
          mar); a tight correspondence is imposed between syntax and seman-
          tics such that every rule of syntax is also a rule  of  semantics
          (the  rule-to-rule hypothesis); monotonicity is always favored at
          the expense of destructive devices which characterize transforma-
          tional grammars.

          Categorial grammars are based on the algebraic notions  of  func-
          tion and argument and can therefore be represented using Church's
          lambda  operator.   The  Lambek  calculus  was  the  first  major
          mathematical tool for the field.
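
          A tiny Python illustration (the simplified lexicon and notation are
          my own) of the function-argument view: a derived category such as
          "S\NP" is a function from NP expressions to S expressions, so
          combining a subject with an intransitive verb is just function
          application:

            lexicon = {
                "John":   ("NP", "John"),
                "sleeps": ("S\\NP", lambda subj: "sleeps(%s)" % subj),
            }

            def combine(fn_entry, arg_entry):
                fn_cat, fn = fn_entry
                arg_cat, arg = arg_entry
                result_cat, wanted = fn_cat.split("\\")   # "S\NP" -> S, NP
                assert arg_cat == wanted, "category mismatch"
                return (result_cat, fn(arg))

            print(combine(lexicon["sleeps"], lexicon["John"]))
            # ('S', 'sleeps(John)')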

          Categorial grammars involve  semantic  categories,  in  agreement
          with   Edmund   Husserl's   meaning   categories   and  Stanislaw
          Lesniewski's and Kazimierz Ajdukiewicz's logics.

          Bach has improved the original model by  allowing  categories  to
          have  internal  structures  that  define  the  features  that are
          relevant  to  determine   lexical   and   syntactic   properties.
          Categories can then be viewed as clusters of features.

Bach Emmon: SYNTACTIC THEORY (Holt, Rinehart & Winston, 1974)

          An in-depth treatment of transformational grammars for  linguists
          that  summarizes  the  progress  made  in the early Seventies and
          updates Bach's earlier "Introduction  to  Transformational  Gram-
          mars".  It  contains a long introduction to Chomsky's "Aspects of
          the Theory of Syntax".

Baddeley Alan: WORKING MEMORY (Clarendon Press, 1986)

          Baddeley developed a theory of working memory based on three subsys-
          tems: a central control (for residual ignorance) and two passive
          storage systems, a speech system and a visual system.

Baddeley Alan: YOUR MEMORY (MacMillan, 1982)

          An introduction to the functioning and structure  of  memory  for
          the broad audience. Baddeley assumes the existence of three types
          of memory: long-term (both episodic and semantic), short-term and
          sensory memory.

Baddeley Alan: ATTENTION (Oxford University Press, 1994)

          A tribute to Donald Broadbent in the  form  of  a  collection  of
          essays on his contributions to various cognitive tasks.

Baddeley Alan: HUMAN MEMORY (Simon & Schuster, 1990)

          An introduction to the theories of memory for the broad audience.

Ballard Dana: COMPUTER VISION (Prentice Hall, 1982)

          This monumental book describes a detailed computational model of
          how representations of physical objects can be constructed from
          images.

Baltes Paul: LIFE-SPAN DEVELOPMENT AND BEHAVIOR (Academic Press, 1984)

          Baltes' theory of dual processes  assumes  that  intelligence  as
          information  processing  is  universal  and  biological,  whereas
          intelligence as knowledge pragmatics is acquired through  experi-
          ence and therefore influenced by cultural factors.

Bar-Hillel Yehoshua: LANGUAGE AND INFORMATION (Addison Wesley, 1964)

          By  building   on   Lesniewski's   and   Ajdukiewicz's   semantic
          categories,  Bar-Hillel  defined  a  variant  of phrase structure
          grammar that he called categorial grammar in which "every sen-
          tence is the result of the operation of one continuous part of it
          upon the remainder, these two parts being the immediate consti-
          tuents of the sentence, such that these constituents are again
          the product of the operation of some continuous part upon the
          remainder, etc".

Barkow Jerome, Cosmides Leda & Tooby John: THE ADAPTED MIND (Oxford Univ Press, 1992)

          A collection of articles on evolutionary  psychology  (i.e.,  the
          evolution of the mind).

          The mind consists of  specialized  modules  designed  by  natural
          selection  to  solve  problems in the environment that have to do
          with survival and reproduction.

          Evolutionary psychology can be seen as an evolution of Wilson's
          sociobiology.

Barr Avron & Feigenbaum Ed: HANDBOOK OF ARTIFICIAL INTELLIGENCE (William Kaufmann, 1982)

          A monumental catalog of models and techniques  for  A.I.  profes-
          sionals and researchers.

Barsalou Lawrence: COGNITIVE PSYCHOLOGY (Lawrence Erlbaum, 1992)

          An introduction to the field.

Bartlett Frederic Charles: REMEMBERING (Cambridge Univ Press, 1967)

          In 1932 Bartlett developed one of the earliest  cognitive  models
          of  memory.  Bartlett  noted  how  memory cannot remember all the
          details, but can "reconstruct" the essence  of  a  scene.  Events
          cannot be stored faithfully, but must have been summarized into a
          different form, a "schema". Individuals do not  passively  record
          stories verbatim, but rather actively code them in terms of sche-
          mas, and then can recount the stories by retranslating the  sche-
          mas into words.

          Each new memory is categorized in a schema which depends  on  the
          already  existing  schemas.   In  practice, only what is strictly
          necessary is added. When a memory must be retrieved, the
          corresponding schema provides instructions to reconstruct it.
          That is why recognizing an object is much easier in its typical
          context than in an unusual context.

Barwise Jon & Perry John: SITUATIONS AND ATTITUDES (MIT Press, 1983)

          Inspired by Gibson's ecological realism, Barwise proceeds to undo
          Frege's  theory  of meaning (that meaning is located in the world
          of sense).  The world is full of  meaning  and  information  that
          living organisms can use.

          Meaning is not exclusive to language; it is pervasive in
          nature ("smoke means fire"). Meaning involves the informational
          content of situations and arises from regularities in the  world.
          Reality  is  made  of situations. Sentences stand for situations.
          The semantic value of a sentence is a set of abstract situations.
          Meaning arises out of recurring relations between situations.

          Barwise's formalism employs  Kleene's  partial  functions  (which
          deal with finite amounts of information).

          Reality comes in situations. Situations are made of  objects  and
          spatio-temporal  locations;  objects have properties and stand in
          relations.  Therefore, a situation is described by a set of rela-
          tions between objects.

          A situation-type is a partial relation from n-ary relations and n
          individuals  to the values true and false.  A course of events is
          a partial function from locations to situation-types.   Therefore
          a  course of events at a location on which it is defined yields a
          situation-type.  A state of affairs is a course of  events  which
          is defined on just one location.
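
          A minimal data-structure reading of these definitions (the facts and
          location names are invented; this is only a paraphrase in Python,
          not Barwise's formalism):

            # A situation-type assigns truth values to some, not all,
            # (relation, individuals) pairs: it is a partial assignment.
            situation_type = {
                ("sleeping", ("Jackie",)): True,
                ("barking", ("Jackie",)): False,
            }

            # A course of events is a partial function from locations
            # to situation-types.
            course_of_events = {
                "backyard-at-noon": situation_type,
                "backyard-at-dusk": {("barking", ("Jackie",)): True},
            }

            # A state of affairs is a course of events defined at
            # exactly one location.
            state_of_affairs = {"backyard-at-noon": situation_type}
            print(len(state_of_affairs) == 1)    # True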

          A living organism (a part of reality capable  of  perception  and
          action)  must be able to cope with the ever new situations of its
          course of events and to anticipate the future course  of  events.
          It  must  be able to pick up information about one situation from
          another situation.  This can be realized by identifying similari-
          ties  between situations and relations between such similarities.
          Each organism performs this process of breaking down reality in a
          different  way,  as  each  organism "sees" reality in a different
          way, based on its ecological needs.

          The type of a situation is determined by  the  regularities  that
          the  situation  exhibits.   Regularities are invariants differen-
          tiated by organism, acquired by adaptation  to  the  environment,
          that  define  its behavior in the environment. These similarities
          between various situations make it possible for  an  organism  to
          make sense of the world. At the same time they are understood by
          all members of the same species, by a whole "linguistic commun-
          ity".

          Formally, one situation can  contain  information  about  another
          situation  only  if there is a relation that holds between situa-
          tions sharing similarities with the former situation  and  situa-
          tions  sharing  similarities  with the latter situation.  In that
          case the first situation "means" the  second.   A  meaning  is  a
          relation  between  different types of situations.  In situational
          semantics the meaning of a declarative  sentence  is  a  relation
          between utterances and described situations.

          Therefore, constraints between types of situations are actual and
          yield  meaning.   Meaning  is defined as relations that allow one
          situation to contain information about another situation.

          Situational semantics solves the semantic problems  of  granular-
          ity, context and factorization by expressing properties and rela-
          tions as primitive entities. By assuming that sentences stand for
          situations,  it avoids all the pitfalls of the logical tradition,
          for which sentences stand for truth values.

          Situations are more flexible than possible  worlds  because  they
          don't need to be coherent and don't need to be maximal. Just like
          mental states.

          Indexicals are held to  represent not only a few  isolated  words
          such  as  "I"  and  "now"  but  the  way the speaker exploits the
          discourse context. They play a key role in the way language  con-
          veys information.

          Propositional attitudes report relations to situations.

          The book also contains a theory of knowledge and beliefs that  is
          similar  to  Dretske's.  An agent knows that p if the agent has a
          belief that p and that belief carries the information that p.

Barwise Jon: THE SITUATION IN LOGIC (Cambridge Univ Press, 1988)

          A collection of a few historical papers by Barwise  on  situation
          theory  and  situation semantics, including philosophical discus-
          sions, replies to criticism and introduction to the  mathematical
          rudiments. Barwise also extends and refines a few of his origi-
          nal concepts.

          Logic should be studied  from  the  perspective  of  information,
          information  processing  and  information  communication. Barwise
          emphasizes the relational nature of information (e.g., perception
          is  a  relation  between  perceiver  and  perceived) and the cir-
          cumstantial nature of  information  (information  is  information
          about the world).

          Situation semantics emphasizes two related phenomena:  efficiency
          of  language  and  partiality of information. Situation semantics
          offers a relation theory of meaning: the meaning  of  a  sentence
          provides  a  constraint  between  the utterance and the described
          situation.

Bechtel William: PHILOSOPHY OF MIND (Lawrence Erlbaum, 1988)

          A broad and accessible survey of various schools of philosophy of
          mind. The book is organized around three topics: language,
          intentionality and the mind-body problem. As far as language goes it
          covers  referential  analysis of meaning (Frege, Russell), speech
          act theory (Austin, Searle, Grice), holistic analysis of  meaning
          (Quine, Davidson), Kripke's possible world semantics and Putnam's
          causal theory of reference.  The chapters on intentionality  deal
          with  the  computational  theory  of mind, cybernetics, Dennett's
          intentional stance.  The mind-body  problem  is  summarized  from
          Descartes' dualism to behaviorism, identity theories, eliminative
          materialism and functionalism.

          A survey of ancient and modern theories of the mind.

Bechtel William: PHILOSOPHY OF SCIENCE (Lawrence Erlbaum, 1988)

          An introduction to logical positivism and more recent theories
          (Kuhn, Feyerabend, Lakatos).

          A survey of modern theories of science.

Bechtel William & Abrahamsen Adele: CONNECTIONISM AND THE MIND (MIT Press, 1991)

          Drawing from James McClelland, David Rumelhart and Geoffrey  Hin-
          ton,  the  book provides a primer to connectionist networks, with
          examples on connectionist simulations of language and  reasoning.
          The book includes a lengthy defense of connectionism against cri-
          ticism and a survey of the impact of connectionism on other  dis-
          ciplines.

Behe Michael: DARWIN'S BLACK BOX (Free Press, 1996)

          Behe is skeptical about  Darwin's  theory  of  evolution  because
          cells  are too complex to have evolved spontaneously. Most cellu-
          lar systems are "irreducibly complex", i.e. they could  not  work
          without  some  of  their parts. If one of the parts is not there,
          the system does not operate, and therefore cannot  reproduce  and
          evolve.  Such  systems  cannot be built by gradual evolution: too
          many of their parts must be there in order for them to be able to
          start  evolving.  Their  structure  cannot  be  due  to evolution
          because their function cannot be built incrementally.  For  exam-
          ple,  a  mousetrap  is  not  a mousetrap until it has a spring: a
          mousetrap with the spring cannot evolve from a mousetrap  without
          a  spring  because  the  latter would have no function, therefore
          would simply not reproduce. Organisms are even more complex  than
          mousetraps: they require sophisticated mechanisms for storing and
          transporting enzymes and proteins, among other things.  The  cell
          is  too complicated, and it needs to be that complicated in order
          to be a living cell, and therefore it cannot  have  evolved  from
          something  that  was  less  complicated. Behe concludes that life
          must have been designed by an intelligent agent.

Berwick Robert: PRINCIPLE-BASED PARSING (Kluwer Academic, 1991)

          A collection of articles on principle-based parsing, in which a
          small set of fundamental principles is used to derive sentence types
          (such as passive). The principles interact deductively to construct
          sentence types. Parsers are highly specialized inference pro-
          cedures.

Bickerton Derek: LANGUAGE AND SPECIES (Chicago Univ Press, 1992)

          Bickerton thinks that language was invented to represent the
          world and speculates on the long series of evolutionary events that
          helped develop that faculty.

          Language is sufficient to account for the rationality and  intel-
          ligence  of  humans.  Language  created the human species and the
          world that humans see.  Language is  a  biological  feature  that
          arises  from  the  genetic code.  Language was created during the
          evolutionary path by a change in neural organization.

          Syntax is the fulcrum of language.

Bischof Horst: PYRAMIDAL NEURAL NETWORKS (Lawrence Erlbaum, 1995)

          Bischof thinks that the  complex  task  of  vision  is  performed
          effortlessly  by the brain because of a massive use of hierarchi-
          cal structures.

Black Max: MODELS AND METAPHORS (Cornell Univ Press, 1962)

          Black's interaction theory of metaphor views the  metaphor  as  a
          means  to  reorganize the properties of the destination.  A meta-
          phor is not an isolated term, but a  sentence.  The  metaphorical
          sentence, or "frame", contains the words that are used metaphori-
          cally, or the "focus". A metaphor involves two subjects, and  one
          of them, the secondary subject, comes with a system of associated
          stereotyped information which can be used  as  a  filter  on  the
          principal subject.  Therefore, there is a tension between the two
          subjects of a metaphor, each subject is a system and the metaphor
          consists in a transaction between the two systems.

          A metaphor does not express similarities: it creates  similarity.
          Metaphors  are based on similitude, not analogy. Metaphors act on
          the organization of the lexicon and on the model of the world. Meta-
          phorizing is related to categorizing (the choice of a category in
          which to place an object is a choice of perspective), but is dis-
          tinguished  from  it  by an incongruity which causes a reordering
          and a new perspective.

          Language is dynamic: what is literal may become metaphoric and
          vice versa.

Block Ned: READINGS IN PHILOSOPHY OF PSYCHOLOGY (Harvard Univ Press, 1980)

          A collection of articles on behaviorism (Putnam,  Skinner,  Chom-
          sky), physicalism (Davidson, Fodor, Putnam, Kripke, Nagel), func-
          tionalism  (Armstrong,  Nagel,  Lewis,   Putnam,   Kim),   mental
          representations   (Fodor,   Dennett),  imagery  (Dennett,  Fodor,
          Kosslyn, Pylyshyn), and linguistics (Stich, Chomsky, Fodor, Katz).

          Block offers his own critique of functionalism and his own theory
          of the mind.

          The psychological state of a person can be  identified  with  the
          physical  process  that  is taking place in the brain rather than
          the state in which the brain is.  The psychological state can  be
          represented  as  the  operation performed on a machine, i.e. with
          the computational state of the machine.  The psychological  state
          does  not  depend on the physical state of the machine and can be
          the same for different machines that are  in  different  physical
          states.

          Qualia (the sensations associated with being in a given
          psychological state) are not easily explained in a func-
          tionalist view. Consider an organism whose functional states are
          identical to ours, but in which pain causes the sensation that we
          associate with pleasure (inverted qualia), and an organism whose
          functional states are identical to ours, but in which pain causes
          no sensation at all (absent qualia): functionalism cannot account
          for either case.

          Functionalism does not prescribe how we can limit the universe of
          organisms  who  have  mental  states. A functionalist might think
          that Bolivia's economy, as expertly manipulated by  a  financier,
          has mental states. Class identity also requires identical inter-
          nal processes, but in this way it excludes beings that we might be
          tempted to consider having mental states, such as an extraterres-
          trial being  who  behaves  like  us  but  is  made  of  different
          material.

Block Ned: IMAGERY (MIT Press, 1981)

          A collection of articles on mental imagery, including results  of
          psychological  experiments and philosophical theories. The start-
          ing point for the debate is that scientists have  found  no  pic-
          tures  or  images  in the brain, no internal eye to view pictures
          stored in memory and no means to manipulate them.  Either (Fodor,
          Kosslyn) the brain has mental pictures that somehow represent the
          real-world images, or (Dennett, Pylyshyn)  the  brain  represents
          images  through a non-imagistic system, namely language, i.e. all
          mental representations are descriptional.

Bobrow Daniel: QUALITATIVE REASONING ABOUT PHYSICAL SYSTEMS (MIT Press, 1985)

          This is the first volume  (a  special  issue  of  the  Artificial
          Intelligence Journal) that brought together the main names in the
          then still young discipline of qualitative  reasoning.  They  all
          share  the aim of explaining a physical system's behavior through
          something closer to common sense than Physics' dynamic equations.
          They  conceive a physical system as made of parts that contribute
          to the overall behavior  through  local  interactions.  They  all
          employ  some  variation  of  Hayes'  measure  space  (a  discrete
          representation of a continuous space that only deals with the sig-
          nificant values that determine boundaries of behavior).

          The main difference in the way they model a system  is  in  their
          ontologies:  Kuipers  adopts  qualitative constraints among state
          variables; DeKleer focuses on  the  devices  (pipes,  valves  and
          springs) connected in a network of constraints; Forbus deals
          with processes by extending Hayes' notion of history.  The system
          behavior is almost always described by constraint propagation.

          Johan DeKleer describes a phenomenon in a discrete measure  space
          through  "qualitative  differential equations", or "confluences".
          An envisionment is the set of all possible future behaviors.

          Ken Forbus defines a quantity space as a partially ordered set of
          numbers.   Common  sense is interested in knowing that quantities
          "increase" and "decrease" rather than on formulas yielding  their
          values in time.

          Benjamin Kuipers formalizes the problem as a sequence of formal
          descriptions: from the structural description he derives the
          behavioral description ("envisionment") and from this the
          functional description. In his quantity space, besides the signs
          of the derivatives, what matters most are critical or "landmarks"
          values,  such as the temperature at which water undergoes a phase
          transition.
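
          For illustration, a small Python sketch (mine, not Kuipers'
          notation) of a qualitative quantity space: a value is reported only
          relative to ordered landmarks, and its change only as increasing,
          steady or decreasing:

            def qualitative_value(temp_c):
                # landmarks for water: 0 (freezing) and 100 (boiling)
                if temp_c < 0:
                    return ("minus-infinity", "freezing")
                if temp_c == 0:
                    return ("freezing",)
                if temp_c < 100:
                    return ("freezing", "boiling")
                return ("boiling",) if temp_c == 100 else ("boiling", "plus-infinity")

            def direction(derivative):
                if derivative > 0:
                    return "increasing"
                return "steady" if derivative == 0 else "decreasing"

            print(qualitative_value(25.0), direction(+1.5))
            # ('freezing', 'boiling') increasing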

          The other papers mainly cover practical applications.

Bobrow Daniel: ARTIFICIAL INTELLIGENCE IN PERSPECTIVE (MIT Press, 1994)

          An excellent selection of the articles (originally  published  by
          the  Journal  of  Artificial  Intelligence)  that made Artificial
          Intelligence, from McCarthy's circumscription to Moore's autoep-
          istemic logic, from Newell's knowledge levels to Pearl's belief
          networks, from DeKleer's, Forbus' and Kuipers' qualitative
          reasoning to Hayes-Roth's blackboard systems. With a chapter-
          tribute to Newell.

          McCarthy's circumscription starts from the "closed-world assump-
          tion", that all relevant information is known (or, all informa-
          tion that is not known can be considered false).
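
          A minimal Python sketch (facts and predicates invented for
          illustration) of the closed-world assumption: a query that cannot
          be derived from the knowledge base is simply treated as false.

          # Whatever cannot be derived from the known facts is assumed false.
          known_facts = {("bird", "tweety"), ("flies", "tweety"), ("bird", "opus")}

          def holds(predicate, subject):
              """Closed-world assumption: absence of proof counts as negation."""
              return (predicate, subject) in known_facts

          print(holds("flies", "tweety"))   # True  (explicitly known)
          print(holds("flies", "opus"))     # False (unknown, hence assumed false)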

Boden Margaret: PHILOSOPHY OF ARTIFICIAL INTELLIGENCE (Oxford, 1990)

          A  collection  of  historical  papers,   starting   with   Warren
          McCulloch's and Walter Pitts' "A logical calculus of the ideas
          immanent in nervous activity" (1943).

          In "Computing machinery and intelligence" (1950) Alan Turing pro-
          posed his famous "Turing test" to determine whether a machine is
          intelligent (a computer can be said to be intelligent if
          its answers are indistinguishable from the answers of a human
          being).

          John Searle's "Minds, Brains and Programs" (1980) summarizes his
          view that computers are purely syntactic and therefore cannot be
          said to be thinking.  His famous thought experiment of the
          "Chinese room" (a man who does not know Chinese but is provided
          with formal rules for building perfectly sensible Chinese answers
          would pass the Turing test, even though he never knows what those
          questions and those answers are about) opened
          the  floodgates  to  the  arguments  that computation per se will
          never lead to intelligence.

          In the introduction Boden surveys the arguments for and against
          Turing's test and the possibility of thinking machines.

          Drew McDermott's "A critique of pure reason" (1987) is a critique
          specifically  of  Pat  Hayes' attempt at formalizing common-sense
          knowledge. Most of reasoning is not deductive and therefore  can-
          not  be reduced to first-order predicate logic.  McDermott proves
          that all logicist approaches, in particular non-monotonic logics
          such as the one advocated by McCarthy (circumscription), yield very
          weak solutions to the problem  of  representing  knowledge  in  a
          tractable  way:  one cannot write axioms independent of a program
          for manipulating them if the inferences to be performed from them
          are not deductions.

          In "Motives, mechanisms and emotions" Aaron Sloman analyzes  emo-
          tions  as  states  in  which powerful motives respond to relevant
          beliefs by triggering  mechanisms  required  by  resource-limited
          systems.   An  autonomous  system  having many motives and finite
          resources  is  prone  to  internal  conflicts  whose   resolution
          requires  emotion-based  mechanisms.   Emotion  is not a separate
          subsystem of the mind, but a pervasive  feature  of  it.   Sloman
          even proposes a generative grammar for emotions.

Boden Margaret: THE CREATIVE MIND (Basic, 1992)

          An analysis of human creativity.

Bogdan Radu: GROUNDS FOR COGNITION (Lawrence Erlbaum, 1994)

          Bogdan's teleo-evolutionary theory claims that cognitive systems
          are  guided  by  the  environment  in their goal-driven behavior.
          Cognitive systems actually are the product  of  the  evolutionary
          pressure  of  guiding behaviors towards goals. Organisms are sys-
          tems that are genetically programmed to  maintain  and  replicate
          themselves,  therefore they must guide themselves to their goals,
          therefore they need to obtain relevant  information  about  their
          environment,  therefore they need to be cognitive.  It makes evo-
          lutionary sense that cognition should  appear.   Central  to  his
          thinking  is  the  concept  of "goal-directedness", the result of
          prebiological evolution which is constantly reshaped  by  natural
          selection.    Natural  selection  presupposes  goal-directedness.
          Goal-directedness arises from the genes themselves, which operate
          goal-directedly.

          Organisms manage to survive and multiply in a  hostile  world  by
          organizing  themselves  to  achieve specific, limited goals in an
          ecological niche.  To pursue their goals, organisms  evolve  ways
          to  identify  and  track  those  goals. Such ways determine which
          knowledge is necessary. To obtain such knowledge, organisms learn
          to exploit pervasive and recurrent patterns of information in the
          world. The information tasks necessary to manipulate such  infor-
          mation  "select" the appropriate type of cognitive faculties that
          the organism must be capable of.

Bond Alan & Gasser Leslie: READINGS IN DISTRIBUTED ARTIFICIAL INTELLIGENCE (Morgan Kaufmann, 1988)

          A collection of articles, and a subject-indexed bibliography.

          Distributed information processing systems, i.e. collections of
          "intelligent"  agents, embody a variety of strategies of decompo-
          sition and coordination.  Research in distributed A.I. focuses on
          such  methods,  and  on  the  forms of interaction that make such
          methods effective.

          Mike Georgeff  discusses  multi-agent  planning.  Barbara  Hayes-
          Roth's  "A  blackboard  architecture  for  control"  is included.
          Frederick Hayes-Roth discusses  ABE.   Also  articles  by  Victor
          Lesser, Carl Hewitt, etc.

          Gasser thinks, with Mead, that  intelligent  behavior  is  essen-
          tially a social behavior and emphasizes the social aspects of the
          interaction among intelligent agents.

Brachman Ronald: READINGS IN KNOWLEDGE REPRESENTATION (Morgan Kaufmann, 1985)

          A collection of  milestone  essays  on  the  topic  of  knowledge
          representation  from  a  semantic  perspective  and  of knowledge
          representation frameworks (mainly semantic networks and  frames).
          It  includes  Pat Hayes' "The logic of frames" (1979) and William
          Woods' "What's in a link" (1975).

          Hayes proves that the language of frames (with the exclusion of
          stereotypical reasoning) can be reduced to a notational variant
          of predicate logic.  A frame is a micro-theory which allows very
          rapid inferences.  On the other hand, the stereotypical reasoning
          of default values goes against the monotonicity of classical logic.
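
          A hypothetical Python sketch (not from the book) of a frame with
          stereotypical default values: an instance can override a default,
          so conclusions drawn from the defaults may have to be retracted,
          which is exactly the non-monotonic behavior Hayes points to.

          # Defaults first, specific information overrides them.
          elephant_frame = {
              "is_a": "mammal",
              "color": "grey",     # stereotypical default
              "legs": 4,           # stereotypical default
          }

          def instantiate(frame, **specifics):
              """Create an instance: copy the defaults, then let facts override."""
              instance = dict(frame)
              instance.update(specifics)
              return instance

          clyde = instantiate(elephant_frame)                 # the stereotypical elephant
          dumbo = instantiate(elephant_frame, color="pink")   # default overridden

          print(clyde["color"], dumbo["color"])   # grey pink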

          Woods highlights that a semantic network confuses  two  types  of
          representation:  assertions and definitions (a taxonomic relation
          between concepts).  A concept  is  equivalent  to  a  first-order
          predicate.  As  first-order  predicate  logic  cannot  handle the
          intension of a concept, a semantic network must exhibit the  same
          limitation. Woods proposes to define a concept as a set of suffi-
          cient and necessary conditions.

          Ross Quillian's "Word concepts" (1967) originated the idea  of  a
          semantic  network of nodes interconnected with associative links.
          Marvin Minsky's "A framework for representing  knowledge"  (1975)
          presented  the fundamental idea of a frame, a knowledge represen-
          tation formalism based on prototypes, defaults, multiple perspec-
          tives, analogies and partial matching.

          James Allen's "Maintaining knowledge  about  temporal  intervals"
          (1983) claims that common-sense time is subject to a number of
          principles, such as relativity (a date is usually specified rela-
          tive to another date) and decomposability (any event can be
          described as a sequence of component events that take place in
          the same interval). These principles establish the pre-eminence of the
          "interval" of time (time as partial ordering of  intervals)  over
          the "instant" of time (time as total ordering of instants).

Brady Michael & Berwick Robert: COMPUTATIONAL MODELS OF DISCOURSE (MIT Press, 1983)

          Bonnie Webber is looking  for  a  formal  language  to  represent
          utterances.  In addition, Candace Sidner tries to track
          discourse entities (especially the focus) over the entire dura-
          tion  of discourse; that involves an understanding of how (defin-
          ite) anaphoras work.

          James Allen thinks that minds are connected to  objects  via  the
          causal  connection between actions and objects, i.e.  via beliefs
          and desires.  Allen is trying  to  marry  Austin's  and  Searle's
          theory  of  speech  acts with Artificial Intelligence's theory of
          planning by assuming that speech acts are just  particular  cases
          of  actions that, like all actions, must be planned.  The speaker
          that asks a question must have a plan of speech acts in mind and,
          in  order  to  answer appropriately, the other speaker must first
          unravel that plan. Understanding the purpose of a question  helps
          understand indirect speech acts.

Brandon Robert: GENES, ORGANISMS, POPULATIONS (MIT Press, 1984)

          A collection of seminal papers on the subject  of  the  level  at
          which natural selection operates.

          Evolutionary theory is based upon the idea  that  species  evolve
          and  their  evolution  is  driven  by natural selection, but what
          exactly evolves and what natural selection acts on is  still  not
          clear.   Nature is organized in a hierarchy: genes are located on
          chromosomes, chromosomes are located  in  cells,  cells  make  up
          organs  which  make up organisms which make up species which make
          up populations which make  up  ecosystems:  at  what  level  does
          selection act?

          Darwin's theory implies that what evolves  is  a  population  and
          what  selection  acts on are the competing organisms of a genera-
          tion within the population.

          Alfred Russel Wallace thinks that selection acts  on  populations
          as  well as individuals.  Wynne-Edwards (1963) thinks that selec-
          tion acts on groups of organisms.  Ernst Mayr (1975) thinks  that
          genes cannot be treated as separate, individual units, that their
          interaction is not negligible. The units of evolution and natural
          selection  are not individual genes but groups of genes tied into
          balanced adaptive systems. Natural selection favors phenotypes,
          not genes or genotypes.

          Lewontin thinks that all entities that exhibit heritable variance
          in  fitness  (from  prebiotic molecules to whole populations) are
          units of selection.

          William Wimsatt thinks that  the  notion  of  selection  must  be
          grounded in the notion of "additive variance". This quantity
          determines the rate of evolution. Variance in fitness is  totally
          additive  when  the  fitness  increase  in a genotype is a linear
          function of the number of genes of a given type  present  in  it.
          Additivity  can  be  proven  to  be  a  special  case of context-
          independence. If variance in fitness at a given level is  totally
          additive,  then  this  is  the  highest  level at which selection
          operates (the entities at that level are  composed  of  units  of
          selection, and there are no higher-level units of selection).
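
          In symbols (my reading of the passage, not Wimsatt's own notation),
          additivity is the requirement that the fitness of a genotype g be a
          linear function of its gene counts,

              w(g) = w_0 + \sum_i c_i \, n_i(g),

          where n_i(g) is the number of genes of type i present in g and the
          coefficients c_i do not depend on the genetic context; under this
          condition the variance in fitness decomposes additively across the
          contributing genes.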

          Robert Brandon distinguishes levels of selection  from  units  of
          selection.

          David Hull distinguishes replicators (units that reproduce  their
          structure  directly,  such  as  genes) from interactors (entities
          that interact directly with their  environment,  such  as  organ-
          isms).  Differences in the interactions of interactors with their
          environment result in differential reproduction of the replicators.

          Hamilton's (1975) kin-selection theory and more general group-
          selection theories are also introduced.

Brandon Robert: ADAPTATION AND ENVIRONMENT (Princeton Univ Press, 1990)

          Natural selection is  defined  as  the  process  of  differential
          reproduction  due  to  differential fitness to a common selective
          environment.  The "selective" environment (measured in  terms  of
          the  relative  fitnesses  of  different  genotypes across time or
          space) is distinguished from the "external" environment  and  the
          "ecological"  environment  (measured using the organism itself as
          the measuring instrument so that only that part of  the  external
          environment  that  affects the organism's contribution to popula-
          tion growth is taken into account).  The selective environment is
          the one that is responsible for natural selection.

          Following David Hull, Brandon generalizes phenotype and  genotype
          to  "interactor" (Dawkins' "vehicle") and "replicator" and posits
          that  selection  occurs  among  interactors.  The  biosphere   is
          hierarchically  arranged and, in agreement with Lewontin, natural
          selection applies  to  any  level  of  the  hierarchy.  Selection
          applies  at  different  levels  of  the hierarchy of interactors.
          Interactors can be lengths of RNA or species, or even replicators
          (but even they behave as interactors when "naturally selected").

          Brandon thinks that adaptation defines the function of a property
          of the organism.  The only processes one needs to study to under-
          stand the properties of a living organism are those that contri-
          bute to adaptation.

Bresnan Joan: MENTAL REPRESENTATIONS OF GRAMMATICAL RELATIONS (MIT Press, 1982)

          A monumental work on grammars as mental representations, which led
          to the definition of lexical functional grammar.  Half of the
          chapters are by Bresnan herself.

          Bresnan's lexical functional grammar posits the existence  of  an
          intermediary  functional  level  between syntactic structures and
          semantic structures.  Two levels of syntactic structures are pos-
          tulated:  constituent (a standard context-free surface parse of a
          sentence) and functional (generated by equations associated  with
          the context-free rules).  Transformations are avoided in favor of
          a richer lexicon and links between nodes in the  constituent  and
          functional structures.

Brillouin Leon: SCIENCE AND INFORMATION THEORY (Academic Press, 1962)

          A seminal book on information theory, which employed  the  theory
          of  thermodynamics  to  formulate  the  "negentropy  principle of
          information".

          A basic point is that information does not reside within the sys-
          tem and is thus phenomenological.

          Entropy (a measure of randomness in  the  state  of  the  system)
          measures the lack of information.

          Information is defined as the amount of uncertainty which existed
          before  a  choice  was  made.  Information is thus the difference
          between the entropy of the observed state of the system  and  its
          maximum possible entropy.

          Brillouin proved that the minimum entropy cost for obtaining one
          bit of information is of the order of 10^-23 joules per kelvin.
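
          In formulas (a standard restatement, not a quotation from the
          book): if P_0 is the number of equally probable states available
          before a choice and P_1 the number remaining afterwards, the
          information gained is

              I = k \ln(P_0 / P_1),

          and the minimum entropy cost of one bit of information is

              \Delta S \ge k \ln 2 \approx 0.96 \times 10^{-23} \; \mathrm{J/K},

          which is the negentropy principle referred to above.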

Broadbent Donald: PERCEPTION AND COMMUNICATION (Pergamon, 1958)

          Broadbent is one of the psychologists who identified two types of
          memory, a "short-term memory", limited to a few pieces of informa-
          tion, capable of retrieving them very quickly and  decaying  also
          very  quickly, and a "long-term memory", capable of large storage
          and much slower in both retrieving and decaying. Broadbent thinks
          that short-term memory is a set of pointers to blocks of informa-
          tion located in the long-term memory.

          Broadbent enunciated  the  principle  of  "limited  capacity"  to
          explain how the brain can focus on one specific object out of the
          thousands perceived by the retina.  The  selective  character  of
          attention  is  due  to  the limited capacity of processing by the
          brain, which can only be conscious of so many events at the  same
          time.  Attention originates from a multitude of attentional func-
          tions in different subsystems of the brain.

          Broadbent's 1958 model of memory reflected well-known features of
          memory:  information about stimuli is temporarily retained but it
          will fade unless attention is turned quickly to  it.   The  unat-
          tended  information is "filtered out" without being analyzed.  He
          draws a distinction between a sensory store of  virtually  unlim-
          ited capacity and a categorical short-term store of limited capa-
          city.  This is the way that a  limited-capacity  system  such  as
          human memory can cope with the overwhelming amount of information
          available in the world.

          Broadbent proposed a block diagram similar to those used in
          computer science, thereby approaching the first computational
          model of memory.
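
          A toy Python sketch (my own schematic, not Broadbent's diagram) of
          the information flow just described: a roomy sensory store, a
          filter that selects items by a physical property such as the ear
          of arrival, and a short-term store of very limited capacity.

          SHORT_TERM_CAPACITY = 4      # "so many events at the same time"

          def filter_model(stimuli, attended_channel):
              sensory_store = list(stimuli)                     # virtually unlimited
              attended = [s for s in sensory_store
                          if s["channel"] == attended_channel]  # selection by physical property
              return attended[:SHORT_TERM_CAPACITY]             # limited-capacity store

          stimuli = ([{"channel": "left ear", "word": w} for w in ["cat", "dog", "sun"]] +
                     [{"channel": "right ear", "word": w} for w in ["red", "blue"]])

          print(filter_model(stimuli, "left ear"))   # only the attended channel gets through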

Broadbent Donald: DECISION AND STRESS (Academic Press, 1971)

          In 1971 Broadbent modified his original information-flow model of
          1958  by  taking into account new physiological and psychological
          findings. Foremost among the  changes  is  that  stimuli  may  be
          selected  by the attentional filter on the basis of semantic pro-
          perties, besides their physical properties.

          In 1984 Broadbent would also propose his "Maltese cross" model
          consisting  of  four  stores  (sensory, short-term, long-term and
          motor output) with a central processing unit  that  controls  the
          flow of information among them.

Brooks Daniel & Wiley E.O.: EVOLUTION AS ENTROPY (Univ of Chicago Press, 1986)

          The goal of this unified theory  of  evolution  is  to  integrate
          Dollo's  law  (the  irreversibility of biological evolution) with
          natural selection.  Natural  selection  per  se  only  states  an
          environmental  constraint, but no directionality in time. Dollo's
          law is considered as a biological manifestation of the second law
          of thermodynamics.

          Unlike Prigogine, Wiley and Brooks believe that  biological  sys-
          tems  are  inherently different from dissipative structures. Bio-
          logical systems owe their order and organization to their genetic
          information,  which  is  inherent  and  inheritable.  Both during
          growth and during evolution  entropy  of  biological  information
          constantly  increases.   Evolution  is  a  particular case of the
          second law of thermodynamics and biological  order  is  a  direct
          consequence of it.

          The creation of new species is made necessary by the  second  law
          and is a "sudden" phenomenon similar to phase changes in Physics.
          Phylogenetic branching is an inevitable increase in informational
          entropy.   The interaction between species and the environment is
          not as important in molding evolution: natural  selection  mainly
          acts  as  a  pruning  factor.   Species are systems in a state of
          non-equilibrium and new species  are  created  according  to  the
          second law.

          Biological systems differ from physical  dissipative  systems  in
          that  their  order  is  based on properties that are inherent and
          heritable.  Their relevant phase space  is  genetic.   The  total
          phylogeny  is  characterized  by an ever increasing genetic phase
          space.  Dissipation in  biological  systems  is  not  limited  to
          energy  but also involves information. Information is transmitted
          to subsequent generations.

          Unlike most theories of information, which use information to
          denote the degree to which external forces create structure
          within a system, Brooks-Wiley's information  resides  within  the
          system  and  is  material, it has a physical interpretation. Such
          information resides  in  molecular  structure  as  potential  for
          specifying homeostatic and ontogenetic processes. As the organism
          absorbs energy from the environment, this potential is actualized
          and is "converted" into structure. Over short time intervals bio-
          logical systems behave like dissipative structures.  Over  longer
          time intervals they behave like expanding phase space systems.

          In conclusion, by studying entropy in biological systems, Wiley
          and  Brooks  propose  a  nonequilibrium  approach  to  evolution.
          Reproduction, ontogeny and phylogeny are examples  of  biological
          organization that exhibit irreversible behavior.  Biological sys-
          tems are nonequilibrium systems.

Brooks Rodney & Steels Luc: THE ARTIFICIAL LIFE ROUTE TO ARTIFICIAL INTELLIGENCE (Lawrence Erlbaum, 1995)

          A collection of papers on the paradigm of situated cognition.

          Brooks' 1991 papers, "Intelligence  without  representation"  and
          "Intelligence  without  reason",  were instrumental in creating a
          new, "situated" approach to cognition by emphasizing the interac-
          tion between an agent and its environment.

          Situated agents have no knowledge. Their memory is not a locus of
          representation but simply the place where behavior is generated.

          In Brooks' subsumption architecture behavior is determined by the
          structure of the environment. The cognitive system has no need to
          represent the world, only to know how to operate in it.  There
          is  no centralized function that coordinates the entire cognitive
          system, but a  number  of  distributed  decisional  centers  that
          operate  in  parallel,  each of them performing a different task.
          The system does not have an explicit representation of what it
          is doing; it has parallel processes that each represent only
          their own very limited goal.

          The system decomposes into layers of goal-driven behavior, each
          layer being a network of finite-state automata, and incrementally
          composes its behavior through the interaction with the world.
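
          A minimal Python sketch (behaviors and priorities invented, and a
          drastic simplification of Brooks' actual suppression and
          inhibition wiring between finite-state machines): each layer maps
          sensed data directly to an action with no shared world model, and
          a higher layer, when it has something to say, subsumes the layers
          below.

          def halt(sensors):                # layer 0: default competence
              return "stop"

          def wander(sensors):              # layer 1: subsumes halting
              return "move forward"

          def avoid(sensors):               # layer 2: subsumes wandering near obstacles
              return "turn away" if sensors["obstacle_ahead"] else None

          LAYERS = [avoid, wander, halt]    # higher layers are consulted first

          def act(sensors):
              for layer in LAYERS:
                  action = layer(sensors)
                  if action is not None:    # the first layer with an opinion takes control
                      return action

          print(act({"obstacle_ahead": True}))    # turn away
          print(act({"obstacle_ahead": False}))   # move forward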

          Brooks can therefore account for the response times  required  in
          the real world. In the real world there is no clearcut difference
          between perception, reasoning and action.

          Brooks' Ptolemaic revolution in cognitive science turns the mind
          into  one  of  many  agents  that  live  in  the environment. The
          environment is the center of the action, not the mind.

          The environment is action, continuous action, continuously chang-
          ing.  Only a system of separate, autonomous control systems could
          possibly react and adapt to such a context.

          The world contains all the information that the  organism  needs.
          Therefore  there  is  no  need  to  represent it in the mind. The
          environment acts like a memory external  to  the  organism,  from
          which  the  organism can retrieve any kind of information through
          perception.

          "Intelligent" behavior can be partitioned into a set of asynchro-
          nous  tasks (eating, walking, etc), each endowed with a mechanism
          of perception and action.  An organism can be built incrementally
          by gradually adding new tasks.

          In other words, every intelligent being has a body!

          Cognition is rational kinematics.

Brown Frank: THE FRAME PROBLEM (Morgan Kaufmann, 1987)

          Proceedings of a workshop on  the  frame  problem.   Yoav  Shoham
          identifies  a  qualification  problem  and an extended prediction
          problem that subsume the frame problem. Frank  Brown  presents  a
          modal logic approach. Matthew Ginsberg's "Reasoning About Action"
          offers a solution based on the search for  the  nearest  possible
          world to the current one.

Bruner Jerome: A STUDY OF THINKING (Wiley, 1956)

          A book that helped launch the cognitive revolution in  Psychology
          and Philosophy.  Bruner concentrates on how human beings categor-
          ize objects.  All cognitive activity depends upon the process  of
          categorizing  events.   A category is a set of events that can be
          treated as if they were equivalent.  Bruner employs techniques of
          game  theory and communication theory to explain how the environ-
          ment is partitioned into  equivalence  classes.   Concept  forma-
          tion,  or  "attainment",  is  achieved  via a number of selection
          (choice of instances)  and  reception  (revision  of  hypothesis)
          strategies.   In general, though, subjects categorize with proba-
          bilistic cues.

Bruner Jerome: ACTS OF MEANING (Harvard University Press, 1994)

          A manifesto of methodology from the man  who  set  up  the  first
          Center  for  Cognitive Studies (in Cambridge, MA, in the Sixties)
          proposes a "cultural psychology" that is centered on meaning, not
          information,  and on the construction of meaning by the mind, not
          on the processing of  information  by  the  mind.  To  understand
          humans  one  must  understand how their experiences are shaped by
          their intentional states. The form of  these  intentional  states
          depend  upon  the  symbolic  systems of their culture. Biological
          inheritance merely imposes constraints on action. Culture enables
          humans  to  transcend those biological limits. Folk psychology is
          but a device for people to organize their  views  of  themselves,
          the  others  and the world they share with them.  Folk psychology
          is not grounded on a logical system, but on  narratives.   Narra-
          tive skills arise somehow from a biological need to narrate. Even
          selves must be viewed in the context of culture  and  society:  a
          self is distributed interpersonally.

Buchler Justus: METAPHYSICS OF NATURAL COMPLEXES (Columbia University Press, 1966)

          A general discussion of complexity from a philosophical point  of
          view.  The world is unlimitedly complex and complexity is the
          result of multiple relatedness among processes.   Buchler  adopts
          an ontology of processes instead of things.

Buck Ross: THE COMMUNICATION OF EMOTION (Guilford Press, 1984)

          Human behavior is a function of several systems of organization:
          innate  special-purpose  processing systems (reflexes, instincts,
          etc) concerned with bodily  adaptation  and  the  maintenance  of
          homeostasis  and that employ a holistic, syncretic type of cogni-
          tion (knowledge by acquaintance); and acquired general-purpose
          processing systems, concerned with making sense of the environ-
          ment and that employ sequential, analytic cognition (knowledge by
          description).  The former (associated with the right hemisphere)
          carry out spontaneous communication involving emotional expres-
          sion, the latter (associated with the left hemisphere) carry out
          symbolic communication  involving  propositions.  The  former  is
          primitive,  the latter also requires the former, and may be based
          upon it both phylogenetically and ontogenetically.

          Buck grounds his model of communication of emotions  on  Shannon-
          Weaver's   theory   of   communication   and  assumes  that  such
          communication occurs via two parallel  streams,  one  spontaneous
          (emotions) and one symbolic (propositions).

          Communication occurs when the behavior of  an  individual  influ-
          ences  the  behavior of another individual. Communication of emo-
          tions, in particular, is a biologically shared signal system that
          has been created through an evolutionary process.

          Emotion is defined as a readout of  motivational  systems.   Buck
          identifies  three functions of emotions: bodily adaptation to the
          environment, social communication with  other  aware  beings  and
          subjective  experience.  All  originate from motives that must be
          satisfied. The emotion is a measure of how  far  they  have  been
          satisfied.

          Buck provides both a general cognitive model of emotions and a
          detailed physical model of their neural processes.

          Buck thinks that behavior is governed by biological,  or  innate,
          epistemic, or acquired, and rational, or processed, factors.

Bundy Alan: THE COMPUTER MODELLING OF MATHEMATICAL REASONING (Academic Press, 1983)

          Bundy introduces the notation of propositional logic  and  predi-
          cate logic, higher-order logics and lambda calculus.  He then
          explains how a computer can perform automatic theorem proving by
          using  resolution,  along the way defining Horn clauses, Kowalski
          form and Skolem normal form. The book  also  touches  on  Douglas
          Lenat's concept formation and Daniel Bobrow's theory formation.
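
          By way of illustration (a sketch of mine, not Bundy's code),
          propositional Horn clauses can be run by simple forward chaining,
          a restricted relative of the resolution method the book describes.

          # Each rule is (set of body propositions, head proposition): body -> head.
          rules = [
              ({"human"}, "mortal"),
              ({"mortal", "greek"}, "candidate_philosopher"),
          ]
          facts = {"human", "greek"}

          def forward_chain(facts, rules):
              """Repeatedly fire any rule whose body is already derived."""
              derived = set(facts)
              changed = True
              while changed:
                  changed = False
                  for body, head in rules:
                      if body <= derived and head not in derived:
                          derived.add(head)
                          changed = True
              return derived

          print(sorted(forward_chain(facts, rules)))
          # ['candidate_philosopher', 'greek', 'human', 'mortal']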

Bunge Mario: TREATISE ON BASIC PHILOSOPHY (Reidel, 1974-83)

          A  monumental  seven-volume  synthesis  of  modern  philosophical
          themes.

          Volume one deals with sense and reference.  Reference is not
          equated to extension.  Intension is an irreducible semantic
          object.  The sense of a construct is relative to  the  theory  in
          which it occurs (sense depends on the context).

          Volume two deals with interpretation and truth.  Meaning is sense
          together with reference. Meaning is not verifiability, truth con-
          ditions, information, etc. Bunge develops a calculus of  meaning.
          A truth measure function (a continuous function) allows for the
          expression of partial truth, or degrees of truth.

          Volumes three and four deal with ontology (substance, properties,
          change, spacetime).  Reality is the aggregation of things holding
          spatiotemporal relations: spacetime can  be  understood  only  in
          terms  of  changing things. Spacetime must be anchored to things,
          not the other way around.  A system is identified by  three  com-
          ponents: its composition, environment and structure. The universe
          is a system composed of subsystems.  Everything is a system or  a
          system component.

          Organisms are particular systems with  emergent  properties.  The
          unit  of  biological  study  is  the  organism-in-the-environment
          together with its subsystems  (from  cells  to  organs)  and  its
          supersystems  (from  population to biosphere). The mind is a col-
          lection of processes of neural systems.  Society is a system made
          of people linked by social relations.

          Volumes five and six deal with epistemology.  Every cognitive
          activity  is  a  neural  process.   Language  is for transmitting
          knowledge and influencing behavior.  Perception yields a  subjec-
          tive   type   of   knowledge.  Conceptualizing  yields  objective
          knowledge. Perception is like copying reality to the brain.  Con-
          ceptualizing  goes  beyond mere copying: it can form new proposi-
          tions out of nonpropositional knowledge (percepts) or it can form
          new  propositions  out of old propositions (inferring). Inference
          yields new propositions, not new concepts.

Bunt Harry: MASS-TERMS AND MODEL-THEORETIC SEMANTICS (Cambridge Univ Press, 1985)

          The book deals with the semantic problems related to  mass  nouns
          (such  as  "water", "music", "luggage", etc), as opposed to count
          nouns. The semantics for mass terms is built on  ensemble  theory
          (an extension of mereology built around the concept "part of").

Buss David: THE EVOLUTION OF DESIRE (Basic, 1994)

          Research on  sexual  behavior  reveals  a  distinct  gender  gap.
          Natural  selection has molded the brains of men and women in very
          different ways as a result of their different reproductive goals.

Cairns-Smith A. G.: GENETIC TAKEOVER (Cambridge University Press, 1982)

          Cairns-Smith argues that the first living beings were not carbon
          compounds but mineral crystals, such as clays.  Life's ancestors
          were self-replicating patterns of defects in those crystals.  One
          day those patterns started replicating in a different substance,
          carbon molecules.

Cairns-Smith A. G.: EVOLVING THE MIND (Cambridge University Press, 1995)

          The author reviews theories of  consciousness  and  is  skeptical
          about the possibility of deriving consciousness from matter.

Calvin Melvin: CHEMICAL EVOLUTION (Clarendon, 1969)

          Calvin explores different autocatalytic scenarios for the  origin
          of  life which assume life spontaneously bootstrapped itself from
          simple molecules and do not require any unlikely event to produce
          very complex molecules.

Calvin William: THE ASCENT OF MIND (Bantam, 1991)

          Calvin looks for the causes of the evolution of the  human  brain
          in ice-age climates.

          The brain got bigger and bigger through  a  three-part  cycle  of
          evolutionary alterations in body proportions which involves a set
          of genes that regulate fetal and childhood growth.

Calvin William: THE CEREBRAL CODE (MIT Press, 1996)

Campbell John: PAST, SPACE AND SELF (MIT Press, 1994)

          Campbell examines how human thinking about space and time differs
          from  animals'  thinking  about space and time (in particular the
          ability to think about the  past).  Campbell  then  examines  the
          consequences on self-consciousness.

Carbonell Jaime: MACHINE LEARNING (MIT Press, 1989)

          Contains nine articles from prominent researchers in the area  of
          machine  learning.  Carbonell's  introduction compares the tradi-
          tional inductive paradigm (constructing the symbolic  description
          of  a concept from a set of positive and negative instances) with
          the new analytic (i.e., deductive) paradigms. The latter  utilize
          past  problem solving experience to formulate the search strategy
          in the space of potential solutions. Deductive  learning  systems
          include:   Jerry  DeJong's  "explanation-based  learning",  Allen
          Newell's "chunking", and Carbonell's own "derivational analogy".

          Pat Langley and others cover concept formation, reviewing histor-
          ical  systems  such  as  Langley's own BACON, Doug Lenat's AM, Ed
          Feigenbaum's  EPAM,  Michael  Lebowitz's  UNIMEM,  Doug  Fisher's
          COBWEB, Jan Zytkow's FAHRENHEIT.

          An  explanation-based  learning  system  is  given  a  high-level
          description  of the target concept, a single positive instance of
          the concept, a description of what a concept  definition  is  and
          domain knowledge.  The system generates a proof that the positive
          instance satisfies the target concept and  then  generalizes  the
          proof.   Richard  Fikes'  STRIPS is recognized as a forerunner of
          explanation-based learning.

          Derivational  analogy  solves  a  problem  by  tweaking  a   plan
          (represented  as  a  hierarchical goal structure) used to solve a
          previous problem.  Jack Mostow surveys a few applications.

          John Holland and Geoffrey Hinton touch briefly on two alternative
          and extreme paradigms, respectively genetic algorithms and connec-
          tionism.

          Holland's "Classifier systems and  genetic  algorithms"  provides
          his definitive version of classifier systems.

          Classifier systems are defined as "massively  parallel,  message-
          passing,  rule-based systems that learn through credit assignment
          (the bucket brigade algorithm) and rule  discovery  (the  genetic
          algorithm)".   When  a  message  from the environment matches the
          antecedent of a rule, the message specified in the consequent  of
          the  rule  is produced. Some messages produced by the rules cycle
          back into the classifier system,  some  generate  action  on  the
          environment. A message is a string of characters from a specified
          alphabet. The rules are not written in the first-order  predicate
          logic of expert systems, but in a language that lacks descriptive
          power and is limited to simple conjunctive expressions.

          Credit assignment is the process whereby the system evaluates the
          effectiveness of its rules.  The bucket brigade algorithm assigns
          a strength (a measure of its past usefulness) to each rule.  Each
          rule  then  makes  a bid (proportional to its strength and to its
          relevance to the current situation) and only the highest  bidding
          rules are allowed to pass their messages on. The strengths of the
          rules are modified according to an economic analogy: every time a
          rule bids, its strength is reduced by the value of the bid, while
          the strengths of its "suppliers" (the rules that sent the mes-
          sages matched by this bidder) are increased.  The bidder's strength
          will in turn increase if its consumers (the rules that receive
          its message) become bidders themselves.  This leads to a chain of
          suppliers/consumers whose success ultimately depends on the  suc-
          cess of the rules that act directly on  the environment.
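
          A toy Python sketch of the strength bookkeeping just described
          (rule names and numbers invented): when a rule wins the bidding,
          its strength is decreased by the bid and the supplier of the
          message it matched is paid; the bidder recoups the payment later
          only if its own message is bought in turn.

          BID_FRACTION = 0.1
          strengths = {"rule_A": 100.0, "rule_B": 80.0}

          def fire(bidder, supplier=None):
              """The bidder wins: it pays its bid; its supplier (if any) is paid."""
              bid = BID_FRACTION * strengths[bidder]
              strengths[bidder] -= bid
              if supplier is not None:
                  strengths[supplier] += bid
              return bid

          fire("rule_A")                      # rule_A matched an environment message
          fire("rule_B", supplier="rule_A")   # rule_B consumed rule_A's message
          print(strengths)                    # {'rule_A': 98.0, 'rule_B': 72.0}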

          Then the system replaces the least useful (weak) rules with newly
          generated  rules  that  are  based  on  the  system's accumulated
          experience,  i.e.  by  combining   selected   "building   blocks"
          ("strong" rules) according to some genetic algorithms.

          Hinton focuses on gradient-descent learning procedures of connec-
          tionist systems.  Each connection computes the derivative, with
          respect to its strength, of a global measure of error in the
          performance  of the network, and then adjusts its strength in the
          direction that decreases the  error.   Hinton  is  interested  in
          learning  procedures that lead to internal representations of the
          environment.  His survey starts with associative memories without
          hidden  units  (linear  and nonlinear associators) and supervised
          networks without hidden units (least squares and perceptron  con-
          vergence   algorithms)   and  proves  the  deficiencies  of  such
          approaches. Backpropagation (a multi-layer  least  squares  algo-
          rithm)  can  instead  lead to the discovery of semantic features,
          but  it  too  exhibits  limitations,  specifically  computational
          intractability and biological implausibility.
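
          A minimal Python sketch of the gradient-descent idea (a single
          linear unit with a squared-error measure, invented data; full
          backpropagation pushes the same derivatives through hidden layers
          via the chain rule).

          learning_rate = 0.1
          weights = [0.0, 0.0]
          # Each example: (input vector, target output). The target is simply x0.
          examples = [([1.0, 0.0], 1.0), ([0.0, 1.0], 0.0), ([1.0, 1.0], 1.0)]

          for epoch in range(100):
              for inputs, target in examples:
                  output = sum(w * x for w, x in zip(weights, inputs))
                  error = output - target
                  # For E = (output - target)^2 / 2, dE/dw_i = error * x_i:
                  # each weight moves against its own error derivative.
                  weights = [w - learning_rate * error * x
                             for w, x in zip(weights, inputs)]

          print([round(w, 2) for w in weights])   # approximately [1.0, 0.0]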

          Hinton also surveys Boltzmann machines (where units update  their
          state based on a stochastic decision rule), Hebbian learning
          (where weight modification depends on both presynaptic and  post-
          synaptic activity), competitive learning (where units in a hidden
          layer compete to become active) and reinforcement learning (where
          credit is assigned to a local decision by measuring how it corre-
          lates with the global reinforcement signal).

          John Anderson's "A theory of origins of human knowledge" general-
          izes the results of his systems, in particular ACT and his latest
          PUPS. They organize knowledge in three levels: knowledge level
          (information  acquired from the environment and innate principles
          of inference), algorithm level (internal  deductions,  inductions
          and compilations) and implementation level (setting strengths for
          the encoding of specific pieces of information).

Carvalo Marc: NATURE, COGNITION AND SYSTEM (Kluwer Academic, 1988)

          A collection of articles on cybernetics applied to the nature  of
          living  systems,  autopoiesis  and  self-organization. One of the
          main themes is that of the "two arrows of time": the  second  law
          of thermodynamics pointing towards entropy increase and therefore
          disorder increase, and evolution pointing the other way by build-
          ing increasingly complex structures of order.

Castaneda Hector-Neri: THINKING, LANGUAGE, EXPERIENCE (University of Minnesota Press, 1989)

          The book advances a general semantics of thinking  that  accounts
          for  the  unity  of  experience: "guise theory". According to its
          ontological scheme, properties are the  building  blocks  of  the
          world.

          Singular reference (reference to individuals insofar as they are
          thought of as individuals) is achieved through one of four
          linguistic mechanisms: indexical reference (required for
          a person to have experience), quasi-indexical reference (required
          to conceive  of  other  subjects  with  experience),  descriptive
          reference  and  reference by proper names.  We refer to ourselves
          and to objects indexically.

          Believing and intending partition the class of mental  states  in
          two  categories, corresponding to contemplative thinking ("propo-
          sitions") and practical thinking ("practitions").

          Proper names are not individuating devices (they are not  genuine
          singular  terms,  they  are  free  variables  of quantification).
          Proper names have an epistemic role (they are useful to  organize
          beliefs)  and a causal role (they allow the retrieval of informa-
          tion).

          The individuality of an individual consists in the  set  of  that
          individual's  differences  from  everything else, the set of dif-
          ferentiating  properties.   The  units   of   individuation   are
          "guises".

          Castaneda emphasized the fundamental indexicality of practical
          thinking (exercised in acts of willing, commanding, advising,
          etc).  Indexical reference is the backbone of  perceptual  refer-
          ence. Indexical reference is experiential reference. Therefore, a
          theory of indexical reference (and  a  semantics  of  indicators)
          depends on a theory of perception.

          In order to deal with indexicals  and  demonstratives,  one  must
          appreciate the difference between sense and meaning: the word "I"
          has the same meaning, no matter  who  utters  it,  but  different
          senses, and different references.

          Guise theory is a theory of predication. Properties are the ulti-
          mate  components of the world. Concrete objects (or "guises") are
          bundles of properties.  A concrete object is made of the members
          of a set of properties plus an operator: the operator (a sort of
          inverse of the abstraction operator) is what turns the properties
          into a concrete object.  For each distinct set of properties there
          is a distinct concrete object that results from  the  application
          of  the  operator on that set. Therefore, "the thing that doesn't
          exist" is a concrete object, because it is made of  a  bundle  of
          properties.

          When assertions of ordinary discourse are made explicit,  proper-
          ties turn out to be predicated of the guises which constitute the
          domain. They are predicated either internally  (if  the  property
          belongs  to  the core of a guise which is the subject of predica-
          tion) or externally. In other words, the  disguised  predications
          of  ordinary discourse are, when made explicit, either internally
          or externally "guised" depending upon the form  of  reference  to
          the subject of predication.

          An object can stand in a number of relationships to  a  property:
          constitution  (the  property  is  a  member  of  the  core of the
          object), identity, consubstantiation, consociation, conflation.

Chalmers David: THE CONSCIOUS MIND (Oxford University Press, 1996)

          Chalmers argues that consciousness cannot  be  explained  with  a
          reductionist approach, because it does not belong to the realm of
          matter. Chalmers proposes to expand science in a fashion that  is
          still  compatible  with today's science (in the areas where it is
          successful) and that allows for a dualist approach.

          Chalmers distinguishes between a phenomenal concept of mind  (the
          way it feels) and a psychological concept of mind (what it does).
          Every mental property is either a phenomenal property, a  psycho-
          logical  one  or a combination of the two.  The mind-body problem
          is therefore made of two parts, one that deals  with  the  mental
          faculties  and one that deals with how/why those mental faculties
          also give rise to  awareness  of  them  (Jackendoff's  "mind-mind
          problem").   The  same distinction applies to consciousness, with
          psychological consciousness being commonly referred to as "aware-
          ness"  (phenomenal  consciousness always comes with psychological
          consciousness). Awareness is having access  to  information  that
          may affect behavior.

          Chalmers shuns the problem of identity and prefers  to  focus  on
          the notion of supervenience. Consciousness supervenes on the phy-
          sical, just like biological properties supervene on physical pro-
          perties  (any  two  situations  that are physically identical are
          also biologically identical). Chalmers defines logical superveni-
          ence  (to  be  interpreted loosely as "possibility", and as logi-
          cally possible worlds that supervene on the physical  world)  and
          natural supervenience (to be interpreted as a real empirical pos-
          sibility, when two sets of properties are systematically and pre-
          cisely  correlated  in  the  natural  world).  Logically possible
          situations are not necessarily also naturally possible situations
          (e.g.,  any  situation that violates the laws of nature). Logical
          supervenience implies natural supervenience, but not vice versa.
          A  natural phenomenon can be reduced to a set of lower-level pro-
          perties when it is logically supervenient  on  those  properties,
          i.e.  it  can  be  reduced  to  the physical when it is logically
          supervenient on the physical.  From his  analysis  it  turns  out
          that  "almost everything" is logically supervenient on the physi-
          cal.

          Using arguments about zombies, inverted spectrum, epistemic asym-
          metry with respect to consciousness and Jackson's thought experi-
          ment of the neuroscientist who has never seen colors, Chalmers
          then proves that cons-
          ciousness is naturally, but not logically, supervenient on physi-
          cal properties. That means that it cannot be reduced to the  phy-
          sical.  Chalmers  therefore  criticizes  cognitive  architectures
          (such as Dennett's), neurobiological theories (such as Edelman's)
          and hypotheses based on quantum mechanics (such as Penrose's).

          Chalmers' "naturalistic monism" admits  both  physical  and  non-
          physical  features  in  the world.  His dualism is different from
          Descartes' in that it claims that "consciousness is a feature  of
          the  world" that is somehow related to its physical properties. A
          new, fundamental, irreducible feature (a set of "protophenomenal"
          properties) must be added to space-time, mass, charge, spin, etc,
          and a set of "psychophysical"  laws  (explaining  how  phenomenal
          properties  depend  on  physical properties) must be added to the
          laws of nature.

          Consciousness is viewed  as  "organizationally  invariant",  i.e.
          every system organized in the appropriate way will experience the
          same conscious states, regardless of what substance  it  is  made
          of.  In this sense, a computer can be intelligent and conscious.

          In order to build a scientific theory of consciousness,  Chalmers
          outlines a few candidate psychophysical laws, such as the princi-
          ple of coherence between  consciousness  and  cognition  and  the
          principle of organizational invariance.

          Still looking for fundamental laws of consciousness, Chalmers
          offers an interpretation of his theory based on the dualism
          between information and pattern: information is what  pattern  is
          from  the  inside. Consciousness is information about the pattern
          of the self. Information becomes therefore the link  between  the
          physical and the conscious.

          Chalmers also offers his own interpretation of quantum theory.

Changeux JeanPierre: NEURONAL MAN (Pantheon, 1985)

          Changeux is one of the brain scientists  who  maintain  that  the
          mental and the neural are simply two aspects of the same physical
          state.

          From neuroanatomy Changeux derives a view of  the  complexity  of
          the  brain:  the evidence for specific localization of particular
          functions always comes with  evidence  for  diffuse  control  and
          interaction of parts.

          The human brain is privileged by the (relatively recent) develop-
          ment  of  the neocortex. The human brain contains representations
          of the world in its cortex, is capable  of  building  new  mental
          representations  and is capable of using them for computations. A
          mental object corresponds to the activity of a population of neu-
          rons.

          Changeux notes that at the level of communication between nerves
          nothing distinguishes the brain from the peripheral nervous sys-
          tem, or, for that matter, from that of any other animal.

          Changeux proposes a "neo-darwinian" theory for the development of
          the  set  of nerve connections that underlie memories and percep-
          tions. The nervous system makes very large numbers of random mul-
          tiple  connections.  External stimuli cause differential elimina-
          tion of some connections. Phenotypic variability is the result of
          experience.

          His theory of "epigenesis by selective stabilization of synapses"
          stems  from  a  number  of  observations: the main organizational
          features of the nervous system are determined by a set of  genes;
          phenotypic  variability  increases in organisms with the increase
          in brain complexity; during development connections  are  created
          and  destroyed in large numbers; neurons communicate even at very
          early stages of development.

          The theory explains the nonlinearity between the complexity of
          the genome and the complexity of the brain.  The evolutionary advan-
          tage of the human species stems from the  individual,  epigenetic
          variability  in  the  organization  of neurons, which resulted in
          greater plasticity in adapting to the environment.

Changeux JeanPierre: ORIGINS OF THE HUMAN BRAIN (Oxford University Press, 1995)

          A collection of essays from neurobiologists, anthropologists  and
          psychologists, covering the anatomy of the brain, genetics, and
          consciousness/mind.

Charniak Eugene: ARTIFICIAL INTELLIGENCE PROGRAMMING (Lawrence Erlbaum, 1987)

          The second edition of a classic textbook of practical Artificial
          Intelligence techniques (very LISP-oriented).

Chauvin Yves & Rumelhart David: BACKPROPAGATION (Lawrence Erlbaum, 1995)

          Theory and practice of the most popular  training  algorithm  for
          neural networks.

Chierchia Gennaro: DYNAMICS OF MEANING (Univ of Chicago Press, 1995)

          A few linguistic phenomena constitute evidence in favor of a view
          of  meaning  as  "context  change", as opposed to the traditional
          view of  meaning  as  content.   Context  updating  would  be  an
          integral part of  the compositional system of meaning.

          Chierchia  proposes  a  "dynamic  binding"   theory   (based   on
          Montague's  intensional  logic)  as  an  alternative to classical
          "discourse representation theory".

Chierchia Gennaro: MEANING AND GRAMMAR (MIT Press, 1990)

          A seminal textbook on semantics.

          The empirical domain of semantics is defined according to the
          linguistic  phenomena  that  a  semantic  theory  is  required to
          account for: entailment (an implication both in  terms  of  truth
          and information that is conveyed), presupposition (an implication
          which does not depend on the truth of  the  premise  because  the
          truth  of  the conclusion is implied in the wording itself of the
          premise), anaphora (expressions that are  connected  to  previous
          expressions), ambiguity (lexical, syntactic and scope ambiguity),
          synonymy (mutual entailment of two expressions), contradiction (a
          sentence  that  can never be true because of incompatible entail-
          ments), anomaly (a sentence that can never  be  true  because  of
          incompatible presuppositions), appropriateness (in the context).

          Theories of meaning include referential or denotational  theories
          (meaning  lies  in  the  relations  of symbols to what they stand
          for), psychologistic or mentalistic  theories  (meaning  lies  in
          their mental representation), social or pragmatic theories (mean-
          ing lies in the social interaction of agents),  but  all  aspects
          should contribute to a complete theory of meaning.

          Problems with denotation (especially Frege's  take  on  reference
          and  sense) and truth (Tarski's correspondence theory) are intro-
          duced. Kripke's and Putnam's causal theory  of  reference  (which
          assumes  a  causal link between a word and what it stands for) is
          sketched.

          Chapters are devoted to: how to derive truth conditions  of  sen-
          tences  containing  quantified  expressions; the relation between
          the meaning of an expression and the meaning of the  speaker  (as
          in  Grice); speech acts (as in Austin and Searle); intensionality
          (as in Montague); discourse analysis (indexicals, contexts,
          filters, ...); lambda abstraction; lexical semantics (including
          thematic roles).
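
          For instance, the truth conditions of quantified sentences can
          be stated set-theoretically.  A toy rendering in Python (an
          illustration only, not the book's notation) treats determiners
          as relations between sets:

          # Toy generalized-quantifier semantics (illustration only).
          every = lambda A, B: A <= B             # every A is B
          some  = lambda A, B: len(A & B) > 0     # some A is B
          no    = lambda A, B: len(A & B) == 0    # no A is B

          dogs    = {"fido", "rex"}
          barkers = {"fido", "rex", "lassie"}

          print(every(dogs, barkers))   # True:  every dog barks
          print(some(dogs, barkers))    # True:  some dog barks
          print(no(dogs, barkers))      # False: some dog does bark
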
           Child William: CAUSALITY, INTERPRETATION AND  THE  MIND  (Oxford
          University Press, 1994)

          The nature of intentional phenomena, such as belief  and  desire,
          in a causal theory of the mind.

Chomsky Noam: SYNTACTIC STRUCTURES (Mouton, 1957)

          With this book Chomsky struck a fatal blow at the behaviorist
          tradition of Skinner and others, which held that research
          should focus solely on external, measurable stimuli and
          responses rather than on abstract mental entities.  At the
          same time Chomsky reacted against structural linguistics,
          which was content with describing and classifying languages.
          Chomsky extended the idea of formal systems to linguistics by
          using the logical formalism to express the grammar of a
          language.

          Chomsky's idea was to concentrate on the study  of  grammar,  and
          specifically syntax, i.e. on the rules that account for all valid
          sentences of a language.  The idea was that language is based  on
          a  system  of  rules determining the interpretation of its infin-
          itely many sentences.

          Chomsky argued for the independence of syntax from semantics,  as
          the  notion of a well-formed sentence in the language is distinct
          from the notion of a meaningful sentence.

          The  phrase  structure  model,  based  on  immediate  constituent
          analysis, is a more powerful tool for the purpose of grammar than
          other existing tools, but still not adequate.  A grammar needs
          to  have  a tripartite structure: a sequence of rules to generate
          phrase structure, a sequence of morphophonemic rules  to  convert
          strings  of morphemes into strings of phonemes, and a sequence of
          transformational rules that transform strings with phrase  struc-
          ture  into  new  strings  to  which  the morphophonemic rules can
          apply.

          Chomsky proposed a hierarchy that categorizes languages according
          to  the  complexity  of the grammars that generate them. The sim-
          plest  languages  are  regular  languages,  or   type-3;   type-2
          languages  are  context  free;  type-1 are context-sensitive; and
          type-0 are recursively enumerable languages.  The definitions are
          based  on  the type of rules needed to generate all the sentences
          of the language.
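
          The difference between the levels can be made concrete with a
          toy example (mine, not the book's): the language consisting of
          n a's followed by n b's is context-free but not regular, so a
          finite-state recognizer cannot accept it, while a single
          counter (in effect, a one-symbol stack) can.

          # Toy illustration of the Chomsky hierarchy (not from the book).
          def regular_ab_star(s):
              """Finite-state recognizer for the regular language a*b*."""
              state = "A"
              for c in s:
                  if state == "A" and c == "a":
                      state = "A"
                  elif state in ("A", "B") and c == "b":
                      state = "B"
                  else:
                      return False
              return True

          def context_free_anbn(s):
              """Recognizer for a^n b^n, using a counter (a degenerate
              stack); no finite-state machine can do this for unbounded n."""
              count, seen_b = 0, False
              for c in s:
                  if c == "a" and not seen_b:
                      count += 1
                  elif c == "b" and count > 0:
                      seen_b = True
                      count -= 1
                  else:
                      return False
              return count == 0

          print(regular_ab_star("aaabb"))     # True  (it is in a*b*)
          print(context_free_anbn("aaabb"))   # False (unbalanced)
          print(context_free_anbn("aabb"))    # True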

          Chomsky posited the existence  of  two  levels  of  language:  an
          underlying  deep  structure,  which  accounts for the fundamental
          syntactic relationships among language components, and a  surface
          structure,  which  accounts  for  the sentences that are actually
          uttered, and which is generated by transformations of elements in
          the deep structure.

          A generative grammar is a system of rules that generates the
          grammatical sentences of the language that it describes and
          assigns to each sentence a grammatical analysis.  The simplest
          type of generative grammar is the finite-state grammar, but no
          natural language is finite-state. In a phrase structure grammar
          the elements of
          the  sentences  are identified by constituents (noun phrase, verb
          phrase, etc).   In  a  transformational  generative  grammar  the
          phrase  structure  (which  produces  the  "deep  structure"  of a
          sentence) is supplemented by a transformational component  and  a
          morphophonemic component (which transform the deep structure into
          the surface structure of the sentence,  e.g.  active  or  passive
          form).
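
          A toy phrase-structure grammar (my own sketch in Python, not
          one of Chomsky's examples) makes the idea of "generating" sen-
          tences from rewrite rules concrete:

          # Toy phrase-structure grammar: rewrite rules generate sentences.
          import random

          rules = {
              "S":   [["NP", "VP"]],
              "NP":  [["Det", "N"]],
              "VP":  [["V", "NP"], ["V"]],
              "Det": [["the"], ["a"]],
              "N":   [["linguist"], ["grammar"]],
              "V":   [["studies"], ["sleeps"]],
          }

          def generate(symbol):
              """Expand a symbol by recursively applying a random rule."""
              if symbol not in rules:        # terminal word
                  return [symbol]
              expansion = random.choice(rules[symbol])
              return [w for s in expansion for w in generate(s)]

          print(" ".join(generate("S")))  # e.g. "the linguist studies a grammar"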

          Chomsky's computational approach had its flaws.  Each Chomsky
          grammar is equivalent to a Turing machine and, because the
          halting problem is undecidable, the processing of a Turing
          machine may never come to an end.  Therefore a grammar may
          never find the meaning of a valid sentence, whereas we have no
          evidence that our brain ever fails to find the meaning of a
          valid sentence in our language.  Later, Gold proved that no
          amount of correct examples of sentences is enough to learn a
          language.

          The  book  was  one  of  the  milestones  of  cognitive  science.
          Chomsky's  formal  method  was  influenced  by mathematical logic
          (particularly  formal  systems)  and  the  computer  model   (the
          information-processing paradigm).

Chomsky Noam: ASPECTS OF THE THEORY OF SYNTAX (MIT Press, 1965)

          In order to explain the  difference  between  "performance"  (all
          sentences that an individual will ever use) and "competence" (all
          sentences that an individual can utter, but will not  necessarily
          utter),  Chomsky  posits  the existence of some innate knowledge.
          Chomsky proved that the grammar of a natural language  cannot  be
          reduced  to  a finite-state automaton. Later, Gold proved that no
          amount of correct examples of sentences is enough to learn a
          language.

          Chomsky argues for the existence of two levels  of  language:  an
          underlying  deep  structure,  which  accounts for the fundamental
          syntactic relationships among language components, and a  surface
          structure,  which  accounts  for  the sentences that are actually
          uttered, and which is generated by transformations of elements in
          the  deep  structure. Transformational analysis does overcome the
          limitations of phrase structure.

          Chomsky's "standard theory" defines a grammar as made of  a  syn-
          tactic component (phrase structure rules, lexicon and transforma-
          tional component), a semantic component and a  phonological  com-
          ponent.  The  lexicon  is  modeled after Katz's lexicon. Context-
          sensitive rules determine the legal positions in the sentence  of
          lexical  items.  The semantic component is also inspired by Katz,
          as it uses projection rules and semantic markers.

          The deep structure of a sentence is a tree  (the  phrase  marker)
          that  contains  all  the  words  that  will appear in its surface
          structure.

          Chomsky begins to couple syntax and semantics by including an
          account of the relation between sound and meaning in the
          construction of a grammar.  The "standard theory" syntax
          provides the mechanisms for transforming a meaning (a deep
          structure) into a phonetic representation (a surface struc-
          ture).

          Chomsky decomposes a user's knowledge of language into  two  com-
          ponents: a universal component (universal grammar), which is the
          knowledge of language possessed by every  human,  and  a  set  of
          parameter  values  and  a  lexicon, which together constitute the
          knowledge of a particular language.
           Chomsky Noam &  Halle  Morris:  THE  SOUND  PATTERN  OF  ENGLISH
          (Harper & Row, 1968)

          A classical textbook on generative phonology.  Besides detailing
          the formal structure of a phonological theory, the book tried to
          formalize phonological processes in a way that would predict
          which processes were likely and which were not. An evaluation
          metric ranks rules according to how likely they are to occur
          (inversely proportional to the number of features needed to
          express the rule).

Chomsky Noam: REFLECTIONS ON LANGUAGE (Pantheon, 1975)

          Chomsky's standard theory assumed that each sentence  exhibits  a
          surface and a deep structure. Many sentences may exhibit the same
          deep structure (e.g.,  active  and  passive  forms  of  the  same
          action).  Understanding language consists in transforming surface
          structures into deep structures.

          These transformations can be  seen  as  corresponding  to  mental
          processes,  performed  by mental modules, each independent of the
          others and each guided by elementary principles.

          Fundamental to his theory is the belief that there exist  "innate
          structures", that the ability to understand and utter language is
          due to a "universal grammar" that is common to all humans and  is
          somehow encoded in the human genome. Then experience "selects"
          the specific grammar that the individual will learn.  Grammar  is
          learned not in stages, as Piaget thought, but simply by gradually
          fulfilling a blueprint that is already in the mind.

          Children do not learn, as they do not make any  effort.  Language
          "happens" to a child. The child is almost unaware of the language
          acquisition process.  Learning to speak  is  not  different  from
          growing,  maturing  and  all  the other biological processes that
          occur in a child. A child is genetically programmed  to  learn  a
          language,  and  experience  will simply determine which one.  The
          way a child is programmed is such that all  children  will  learn
          language the same way.

          Chomsky also notes how the language spoken by a linguistic com-
          munity is identical down to the smallest detail, even though no
          individual of the community has been exposed to all those
          details.

          The universal grammar is the linguistic genotype. Its  principles
          are  invariant  for  all languages. The values of some parameters
          can be "selected" by the environment out  of  all  valid  values.
          This  process  is  similar  to  what  happens  with  other growth
          processes (e.g., with the immune system).
           Chomsky  Noam:  THE  LOGICAL  STRUCTURE  OF  LINGUISTIC   THEORY
          (University of Chicago Press, 1975)

          A detailed, technical exposition of  Chomsky's  early  linguistic
          theory.
           Chomsky Noam: RULES AND REPRESENTATIONS  (Columbia  Univ  Press,
          1980)

          Chomsky defends (on philosophical and psychological grounds)  his
          position that grammars are internally represented in the mind and
          that an initial state of knowledge is shared by  all  individuals
          and then developed by social and cultural interactions.
           Chomsky Noam: THEORY OF GOVERNMENT AND BINDING (MIT Press, 1982)
           Chomsky Noam: LECTURES ON GOVERNMENT AND BINDING (Foris, 1981)

          Chomsky's hypothesis is that sound and meaning  are  mediated  by
          syntactic  representations.  A  universal grammar, an innate pro-
          perty of the human mind, defines what is a possible grammar,  and
          therefore a possible language.

          The government-binding theory puts constraints on which  features
          can  occur  in  the same rule, so that grammatical information is
          modularized  and  localized  (e.g.,  the  "projection  principle"
          states  that lexical properties must be satisfied in the same way
          at all  levels of syntactic  representation).   This  process  of
          constraining minimizes the effort required to learn a grammar (it
          limits possible rule applications).

          Universal principles of grammar limit  language-specific  options
          to a (small) set of "parameters".

          The lexicon is the repository of lexical information that  cannot
          be predicted from the universal principles or from choices of
          parameters.

          The final level of syntactic derivation, that of "logical  form",
          must  meet  the  "theta  criterion"  (every  theta  role  must be
          uniquely assigned).

          Every sentence has a quadruple structure: the D-structure,
          generated by phrase-structure rules; the S-structure, obtained
          from the D-structure by applying transformational rules; the
          P-structure, a phonetic structure; and the logical form, a
          semantic component (a first-order logical representation of
          the semantics of the natural-language sentence).

          An anaphor is bound in its local domain. A pronominal is free  in
          its   local  domain.  An  r-expression  (non-anaphoric  and  non-
          pronominal) is free.

          His 1970 X-bar theory eliminated the distinction between features
          and  categories,  and  reduced  every  expression  to  a  set  of
          features. This way verbs and nouns (e.g., "destroy" and "destruc-
          tion")  are  related in virtue of the features they share. The X-
          bar theory was made possible by the  separation  of  the  lexicon
          from the phrase structure rules (i.e., from the computation).

          The projection principle, the theta theory and the  X-bar  theory
          compose  the structure-building tools of the theory of government
          and binding.

          A universal grammar should include a number of  interacting  sub-
          systems  to deal with specific problems, such as the relations of
          anaphors to their antecedents (theory of binding) and the rela-
          tions between the head of a construction and categories dependent
          on it (theory of government). Other subsystems involve  determin-
          ing  thematic  roles,  assigning  abstract cases, posing locality
          conditions.

Chomsky Noam: KNOWLEDGE OF LANGUAGE (Greenwood, 1986)

          Chomsky attacks two paradoxes: how humans can know so much  given
          that they have such limited evidence; how humans can know so lit-
          tle given that they have so much  evidence.  The  problem  is  to
          determine  the  innate  endowment  that  bridges  the gap between
          experience and knowledge.
 Church Alonzo: CALCULI OF LAMBDA CONVERSION (Princeton  Univ
          Press, 1941)

          Church's intuition was to determine a way to compare two
          functions.  A function can be defined either "intensionally", as
          the computational procedure that computes its value, or "exten-
          sionally", as the set of input/output correspondences. Two func-
          tions can be compared in either of the two fashions.  To compare
          them "intensionally", Church created the "lambda" abstraction,
          which provides rules to transform any function into a canonical
          form.
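
          The flavor of reduction to a canonical (normal) form can be
          conveyed by a minimal sketch in Python (an illustration only,
          which for simplicity assumes that all bound variables have
          distinct names, so no capture-avoiding renaming is needed):

          # Lambda terms: ("var", x), ("lam", x, body), ("app", f, a).
          def substitute(term, name, value):
              kind = term[0]
              if kind == "var":
                  return value if term[1] == name else term
              if kind == "lam":
                  x, body = term[1], term[2]
                  if x == name:          # the binder shadows the name
                      return term
                  return ("lam", x, substitute(body, name, value))
              return ("app", substitute(term[1], name, value),
                             substitute(term[2], name, value))

          def normalize(term):
              """Repeatedly apply beta-reduction until no redex is left."""
              kind = term[0]
              if kind == "var":
                  return term
              if kind == "lam":
                  return ("lam", term[1], normalize(term[2]))
              f, a = normalize(term[1]), normalize(term[2])
              if f[0] == "lam":          # beta-redex: (\x. body) a
                  return normalize(substitute(f[2], f[1], a))
              return ("app", f, a)

          identity = ("lam", "x", ("var", "x"))
          twisted  = ("lam", "z", ("app", identity, ("var", "z")))
          # Both reduce to the identity function (the same canonical form,
          # up to renaming of the bound variable):
          print(normalize(identity))   # ('lam', 'x', ('var', 'x'))
          print(normalize(twisted))    # ('lam', 'z', ('var', 'z'))
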
           Churchland Paul: SCIENTIFIC REALISM AND THE PLASTICITY  OF  MIND
          (Cambridge Univ Press, 1979)

          The meaning of our common observations is determined not by  sen-
          sations but by a network of common beliefs.

          Churchland's attitude towards meaning is as holistic as Quine's,
          but  Churchland interprets Quine's network of meanings as a space
          of semantic states, whose dimensions are all  observable  proper-
          ties.  Each  expression in the language is equivalent to defining
          the position of a concept within this space according to the pro-
          perties that the concept exhibits in that expression.  The seman-
          tic value of a word derives from its place in the network of  the
          language  as  a  whole.   The brain performs computations on such
          representations by means of coordinate transformations  from  one
          state space to another.

          Translation is a mapping that preserves semantic importance, that
          finds  an  intensional  structure  in the target language that is
          isomorphic to the intensional structure of the source language.
          Unlike  Quine,  Churchland thinks that translation is possible as
          long as the two languages have isomorphic intensional structures.
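
          A cartoon of this picture in Python (my own illustration, not
          Churchland's model): concepts are points in a space whose axes
          are observable properties (invented here), and a coordinate
          transformation into another state space preserves their rela-
          tive geometry, i.e. the "intensional structure".

          # State-space cartoon of concepts and coordinate transformations.
          import numpy as np

          # Axes: (size, ferocity, sociability) -- invented properties.
          concepts = {
              "dog":  np.array([0.4, 0.3, 0.9]),
              "wolf": np.array([0.5, 0.8, 0.6]),
              "cat":  np.array([0.2, 0.4, 0.4]),
          }

          # A coordinate transformation into another state space
          # (here a rotation in the first two dimensions).
          theta = np.pi / 6
          R = np.array([[np.cos(theta), -np.sin(theta), 0],
                        [np.sin(theta),  np.cos(theta), 0],
                        [0,              0,             1]])
          target = {name: R @ v for name, v in concepts.items()}

          # Relative distances are preserved by the transformation.
          d = lambda a, b: np.linalg.norm(a - b)
          print(round(d(concepts["dog"], concepts["wolf"]), 6))
          print(round(d(target["dog"], target["wolf"]), 6))   # same value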

Churchland Paul: MATTER AND CONSCIOUSNESS (MIT Press, 1984)

          A beginner's level introduction to the topic.
          Churchland outlines the main  areas  of  research:  what  is  the
          nature  of  mental states and processes (the ontological problem,
          or the body-mind problem), where do psychological terms get their
          meaning  (the semantical problem), are other people conscious and
          why can we only perceive our own consciousness (the epistemologi-
          cal problem), what disciplines are relevant to the study of cons-
          ciousness (the methodological problem).

          As part of the ontological problem, substance dualism (the mind
          is a different substance from the brain) and property dualism
          (the mind is the same substance as the brain, but comes from a
          class of properties that are exclusive to the brain) are
          outlined, as
          opposed to materialism (one kind of substance, one class of  pro-
          perties)  and in particular to the identity theory (mental states
          are physical states of the brain) and to functionalism (a  mental
          state  is  defined  uniquely by the causal relation it bears over
          behavior and other mental states).

          The semantical problem can be solved assuming that the meaning of
          a  psychological  term  comes either from inner ostension, opera-
          tional definition or its place in a network of laws.

Churchland Patricia: NEUROPHILOSOPHY (MIT Press, 1986)

          Churchland provides a historical  introduction  to  neuroscience,
          from  the  structure  of  the nervous system to neurology; then a
          historical introduction to the philosophy of science, from  Plato
          to Popper.

          Folk psychology is an  empirical  theory,  just  like  any  other
          scientific  theory,  except  that,  instead of numeric attitudes,
          folk psychology exhibits propositional attitudes. Folk psychology
          as  a  scientific  theory  is  incomplete (as it does not explain
          dreams, craziness and so forth), is the subset of a  theory  that
          has  already  been  falsified  (when  we  realized  that physical
          phenomena such as thunder are not due to the gods) and is  diffi-
          cult  to  integrate with other scientific theories. Given its low
          "productivity", folk psychology should be abandoned.  Terms  such
          as "belief" and "desire" are as much scientific as the four spir-
          its of alchemy.

          Churchland compares propositional attitudes  to  numerical  atti-
          tudes (belief to length, desire to velocity, fear to temperature,
          seeing to charge, suspecting to kinetic energy) and contends that
          laws  can  be made for propositional attitudes that are analogous
          to the ones for numerical attitudes.

          In the next few chapters Churchland  attacks a  number  of  argu-
          ments  against  the program of reducing mental states to physical
          states.  She criticizes the arguments of substance  and  property
          dualism.   She  examines  Nagel's  claim  that  qualia  cannot be
          reduced to neural states, Jackson's claim that sensations  cannot
          be  reduced  to brain states and Popper's claim that the world of
          mental states cannot be part of the world of physical states, and
          argues that none of them provides a conclusive proof.

          Churchland is searching for a unified  theory  of  cognition  and
          neurobiology.    Churchland   believes   in  a  "co-evolutionary"
          approach to the mind, whereby cognitive psychology and  neurosci-
          ence  are complementary to each other, rather than autonomous.  A
          computational theory of the mind should be based on a  theory  of
          the  structure  of  the  brain.  The symbols of Fodor's mentalese
          should be somehow related to neurons. And abstract laws for  cog-
          nitive  processes  should  be reduced to physical laws for neural
          processes.

          The fundamental model of cognitive neurobiology (the "phase-space
          sandwich"  model)  is  a  set of interconnected sheets of neurons
          modelled on specific cerebral structures that perform by means of
          coordinate transformations (Paul Churchland).

Churchland Paul: ENGINE OF REASON (MIT Press, 1995)

          The book provides a detailed description of how the brain per-
          ceives sensory input (in particular vision) and relates the
          findings to artificial neural networks. The emphasis is on the
          power of sensory representation through vector coding.  It also
          briefly surveys different takes on consciousness (Nagel, Jack-
          son, Searle).

Churchman Charles: THE DESIGN OF INQUIRING SYSTEMS (Basic, 1971)

          Churchman thinks that mental development occurs  as  construction
          of  mental  models. He identifies  five "inquiring systems" (sys-
          tems to acquire knowledge): Leibniz's, or deductive; Locke's,  or
          inductive;  Kant's, or analogical; Hegel's, or dialectical (build
          hypotheses that are antithetical to  the  previous  models);  and
          Singer's metrological (that can control the previous four).

Clark Andy: MICROCOGNITION (MIT Press, 1989)

          The book provides a reasoned critique of artificial intelligence
          and cognitive science and a defence of parallel distributed pro-
          cessing. Clark finds clues in general considerations about bio-
          logical systems that fit well with the parallel distributed
          model.
          Evolved creatures do not store information in a costly  way  when
          they  can  use the structure of the environment for the same pur-
          poses.  Complex biological systems have evolved  subject  to  the
          constraints  of  gradualistic  holism: the evolution of a complex
          system is possible only insofar as that system  is  the  last  or
          latest link in a chain of structures, such that at each stage the
          chain involves only a small change (gradualism)  and  each  stage
          yields a structure that is itself a viable whole (holism).

          Folk-psychological phenomena that do not seem to lend  themselves
          to  a connectionist explanation should be approached with a mixed
          model, that still uses the symbolic-processor model but always on
          top of a parallel distributed one.

Cohen Fred: IT'S ALIVE (Wiley, 1994)

          An introduction to the  history  of  how  computer  viruses  were
          created.

           Cohen Jack & Stewart Ian: THE COLLAPSE OF CHAOS (Viking, 1994)

          The theme of the book is how the regularities  of  nature  emerge
          from  the  underlying  chaos  and complexity of nature: "emergent
          simplicities collapse chaos". The first  part  introduces  scien-
          tific  themes  of cosmology, quantum theory, biological evolution
          and psychology.  Consciousness and life are described as "systems
          of interactive behavior".

          Then the book emphasizes that external constraints are  fundamen-
          tal  in  shaping biological systems (DNA does not uniquely deter-
          mine an organism) and new concepts are defined: "simplexity" (the
          tendency  of  simple rules to emerge from underlying disorder and
          complexity) and "complicity" (the tendency of interacting systems
          to  coevolve  leading to a growth of complexity). Simplexity is a
          "weak" form of emergence, and  is  ubiquitous.  Complicity  is  a
          stronger form of emergence, and is responsible for consciousness
          and evolution.  Emergence is the rule, not the exception, and  it
          is shaped by simplexity and complicity. A science of emergence is
          proposed as an alternative to traditional, reductionist, science.

          A wealth of biological themes are touched  upon  along  the  way,
          from  Darwin's  natural  selection to Dawkins' selfish gene, from
          Gould's contingency to DNA, not to mention mathematical subjects,
          from fractals to information theory.

Collins Alan: THEORIES OF MEMORY (Lawrence Erlbaum, 1993)

          A collection of papers by cognitive psychologists, including
          Baddeley ("working memory and conscious awareness"), D. Schacter,
          Susan Gathercole, William Hirst and Lawrence Barsalou.
           Comrie Bernard:  LANGUAGE  UNIVERSALS  AND  LINGUISTIC  TYPOLOGY
          (Univ. of Chicago Press, 1981)
          Comrie proposes a catalog of universal properties  that  seem  to
          hold for all known languages.

Conrad Michael: ADAPTABILITY (Plenum, 1983)

          Conrad's "statistical state model" of  the  evolutionary  process
          distinguishes between adaptedness (fixed adaptations) and adapta-
          bility (response to the environment's fluctuations).   Adaptabil-
          ity  is  adaptedness  to  an  ensemble of environments and can be
          decomposed into anticipation (uncertainty of behavior of the sys-
          tem  which  is  used to dissipate environmental fluctuations) and
          indifference (uncertainty of  the  environment  that  the  system
          incorporates into its behavior).  The maximum total modifiability
          (uncertainty) of a system approaches over time the average uncer-
          tainty of its environment.  Conrad then defines formally the max-
          imum total modifiability of a system. An increase in uncertainty
          at one level of organization is compensated by changes in adapta-
          bility at some level. Levels compensate for each other's fluctua-
          tions.
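
          Conrad's "uncertainties" are information-theoretic quantities.
          A minimal gloss in Python (only an illustration of the notion
          of uncertainty, not Conrad's statistical state model) compares
          the Shannon entropy of an environment's fluctuations with the
          entropy of a system's behavioral repertoire:

          # Shannon entropy as a measure of "uncertainty" (gloss only).
          from math import log2

          def entropy(probs):
              return -sum(p * log2(p) for p in probs if p > 0)

          environment = [0.25, 0.25, 0.25, 0.25]  # 2 bits of uncertainty
          behavior    = [0.5, 0.5]                # 1 bit of modifiability

          print(entropy(environment))   # 2.0
          print(entropy(behavior))      # 1.0
          # On Conrad's account the system's total modifiability should,
          # over time, approach the average uncertainty of its environ-
          # ment; here the behavioral repertoire carries less uncertainty.
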
           Corriveau Jean-Pierre: TIME-CONSTRAINED  MEMORY  (Lawrence  Erl-
          baum, 1995)

          A theory of grounded cognition that accounts for  the  diachronic
          (over  time  a  text  may be interpreted in different ways by the
          same reader) and non-deterministic (a text may or may not be
          interpreted  by  a  reader)  nature of comprehension.  Linguistic
          comprehension is viewed as a time-constrained process.  Rules for
          linguistic  comprehension can be implemented by simple "knowledge
          units" that work in a very constrained amount of time.

Coveney Peter: FRONTIERS OF COMPLEXITY (Fawcett, 1995)

          An accessible introduction to theories of nonlinear systems.

           Cowan Nelson: ATTENTION AND  MEMORY  (Oxford  University  Press,
          1995)

          Cowan puts forth a theory of memory  that  discriminates  between
          memory processes operating within and outside the focus of atten-
          tion.  At any time the focus of attention comprises only a subset
          of  the  information  that is currently activated.  In this model
          the role of attentional  filter  is   played  by  habituation  of
          orienting, rather than by the filter of Broadbent's model.

          Memory and attention are closely integrated.  Memory is driven by
          a  number  of  processes (encoding, activation, decay, retention,
          reactivation, context-dependent retrieval), but all of  them  are
          affected by attention. Automatic processes alone cannot achieve a
          more complete encoding of the stimuli, longer-lasting activation,
          or a more conscious retrieval process.

          Short-term memory can be viewed as a hierarchical structure con-
          sisting of the activated portion of memory plus the portion that
          is in the focus of attention.

          One of the key aspects of long-term  memory  is  the  distinction
          between  memory  stored and retrieved automatically versus memory
          stored and retrieved with the benefit of the attentional  system.
          Consciousness  is  but the phenomenological counterpart of atten-
          tion.

          Drawing from a vast literature, Cowan also tries  to  map  neural
          processes into his own psychological model of memory (e.g.,
          attention-related long-term memory may be stored with the help of
          the  hippocampus,  the  focus  of attention may be located in the
          parietal lobe, etc).

          Following Kissin, Cowan distinguishes three levels of  conscious-
          ness:  basic  alertness (mediated by signals showering the entire
          cortex); general awareness (produced by neural circuits including
          the  thalamus); and self-awareness (possibly from the integration
          of signals from various association areas).

          The book is full of references to contemporary research and can
          also  serve  as  a  guide to psychological and neurophysiological
          research projects in the field of memory.
 Cox Richard: THE ALGEBRA OF PROBABLE INFERENCE (Johns Hopkins
          Press, 1961)

          Unlike Savage, who built his theory of probabilities on pragmatic
          arguments  regarding  decision making, Cox attempted to develop a
          theory of probabilistic inference founded  on  axiomatic  princi-
          ples.   His  axioms refer only to abstract entities such as "evi-
          dence" and "belief". Any phenomenon  that  can  be  expressed  by
          means  of  Cox's axioms can be reduced to probabilistic calculus.
          Cox attributes nonfrequentist but  objective  interpretations  to
          prior probabilities.
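
          The upshot of Cox's axioms is that any consistent calculus of
          degrees of belief obeys the product and sum rules, i.e. behaves
          like probability.  A minimal numerical illustration in Python
          (the numbers are invented, not Cox's) is ordinary Bayesian
          updating of a degree of belief:

          # Degrees of belief manipulated with the sum and product rules.
          prior_h        = 0.01   # belief in hypothesis H beforehand
          p_e_given_h    = 0.90   # belief in evidence E if H is true
          p_e_given_noth = 0.05   # belief in evidence E if H is false

          # Sum rule: total probability of the evidence E.
          p_e = p_e_given_h * prior_h + p_e_given_noth * (1 - prior_h)

          # Product rule rearranged (Bayes' theorem): updated belief in H.
          posterior_h = p_e_given_h * prior_h / p_e
          print(round(posterior_h, 3))   # ~0.154
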
           Craik Kenneth: THE NATURE OF EXPLANATION (Cambridge Univ  Press,
          1943)

          Craik was one of the first visionaries to posit  that  the  human
          brain  can be considered as a particular type of machine which is
          able to build internal models of the world, and process  them  to
          produce  action.   Craik's  improvement over Descartes' automaton
          (limited to mechanical reactions to external stimuli) was consid-
          erable  because  it involved the idea of an "internal representa-
          tion" and a "symbolic processing" of such  representation.   Des-
          cartes'  automaton  had  no  need  for  knowledge  and inference.
          Craik's automaton needs knowledge and inference and the  process-
          ing  of  knowledge  is  what  yields  intelligence. Craik's ideas
          predate the theory of knowledge-based systems, Fodor's  mentalese
          and Johnson-Laird's models.

Crick Francis: LIFE ITSELF (Simon & Schuster, 1981)

          Crick examines the story of life on planet Earth and draws a  few
          unusual conclusions.

          The mind came into the picture quite  late  in  the  evolutionary
          process.  If  mind is unique to humans, then a tiny change in the
          evolutionary chain could have resulted in no humans,  and  there-
          fore no mind. Mind does not look like a momentous episode but like
          a mere accident.

          Natural selection has the function of making unlikely events very
          common.

Crick Francis: ASTONISHING HYPOTHESIS (MacMillan, 1993)

          Crick summarizes recent developments in neurobiology  and  specu-
          lates that synchronized firing in the range of forty Hertz in the
          areas connecting the thalamus and the cortex might explain  cons-
          ciousness.   Mostly  this  book  discusses  the  neural basis for
          visual awareness.  The "astonishing  hypothesis"  is  that  cons-
          ciousness can be explained by science.
 Cronin Helena: THE ANT AND THE PEACOCK (Cambridge University
          Press, 1992)

          The book, written in colloquial English, focuses on two contro-
          versial and apparently contradictory (in the light of natural
          selection) phenomena of biological  evolution:  sexual  selection
          and altruism.

          Darwinism solved the problem  of  "design  without  a  designer":
          variation  and  selection  alone can shape the animal world as it
          is, although variation is undirected and there is no selector for
          selection.  Implicit  in darwinism was the idea that evolution is
          due to replicators rather than organisms, that the subject of its
          theory  is hereditary units.  Natural selection is about the dif-
          ferential survival of  replicators.   Genes  can  be  replicators
          whereas  organisms, groups and other levels of the hierarchy can-
          not. Organisms are but vehicles of replicators.  Genes  are  per-
          petuated  insofar  as  they  yield phenotypes that have selective
          advantages over competing phenotypes. Organism-centered darwinism
          is but an approximation of gene-centered darwinism.

          Genes can also have phenotypic effects  that  extend  beyond  the
          bodies  that  house them: they can affect an "extended phenotype"
          (e.g., a bird's nest or a  spider's  web,  parasites,  symbiosis,
          etc).  Pleiotropy (the phenotypic side effects) may sometimes be
          caused by adaptation of the extended phenotype (a parasitized
          organism may exhibit an "unintended" behavior which is in reality
          part of the parasite's adaptive process).

          Cronin's "gene selectionism" argues that genes rather than organ-
          isms  (as Darwin held) are primary units of natural selection and
          shows how this view can  solve  two  notorious  problems:  sexual
          selection as displayed by the peacock and altruism as illustrated
          by the ant.

          Cronin reviews Darwin's and Wallace's debate on the  function  of
          sexual  selection. Darwin's "good taste" theory (purely aesthetic
          justification) could explain the extravagance  of  male  ornament
          but  not female choice; Wallace's "good sense" theory (search for
          optimal male) could explain female choice but not male  ornament.
          Fisher  proposed a compromise, by proving that good taste is good
          sense: choosing an attractive  male  is  adaptive  for  a  female
          because she will have attractive offspring (success breeds suc-
          cess).

          From the point of view of a gene, any organism carrying it is  an
          equivalent  reproductive  source. In many cases siblings are more
          closely related (genetically speaking) than parents and offspring.
          Adaptation is for the good of the replicator. Therefore, it
          is not surprising that sometimes organisms  sacrifice  themselves
          for  improving  their  kin's survival. Kin selection is part of a
          gene reproduction strategy.

          Darwin did not solve the problem of  speciation  (the  origin  of
          species),  i.e.  how a species can split into two species. Cronin
          briefly discusses Darwin's and Wallace's positions  and  her  own
          conjectures.
           Crowder Robert: PRINCIPLES OF  LEARNING   AND  MEMORY  (Erlbaum,
          1976)

          A comprehensive  manual  of  research  on  learning  and  memory.
          Crowder  presents findings and theories about iconic memory (pre-
          categorial storage), encoding in memory (vision, audition and
          speech),  the  working  of  short-term  memory,  nonverbal memory
          (eidetic imagery), primary  memory  (consciousness),  forgetting,
          processes  of  learning  and  retrieval.  Hundreds of studies are
          mentioned and reviewed.