Tarski Alfred: LOGIC, SEMANTICS, METAMATHEMATICS (Clarendon, 1956)

          A collection of the historical papers by  Tarski,  in  particular
          "On  the  concept  of  truth",  which advanced the correspondence
          theory of truth: a statement is true if it corresponds  to  real-
          ity.  Tarski's semantics has the goal of reducing all concepts to
          physical concepts. All semantic concepts are defined in terms  of
          truth,  and truth is defined in terms of satisfaction, and satis-
          faction is defined in terms of  physical concepts.

          Tarski created the first model theory  for  quantified  predicate
          logic.
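
The reduction chain (semantic concepts to truth, truth to satisfaction) can be
made concrete with a toy evaluator. Below is a minimal sketch, assuming
formulas encoded as nested tuples and a model given as a finite domain plus an
interpretation of predicate symbols; the encoding and the names (satisfies,
true_in) are illustrative, not Tarski's notation.

    # A toy Tarskian satisfaction relation over a finite model.
    # Formulas are nested tuples, e.g. ("forall", "x", ("P", "x")).
    # Illustrative encoding, not Tarski's own formalism.

    def satisfies(model, formula, assignment):
        """True iff the assignment satisfies the formula in the model."""
        op = formula[0]
        if op == "not":
            return not satisfies(model, formula[1], assignment)
        if op == "and":
            return (satisfies(model, formula[1], assignment) and
                    satisfies(model, formula[2], assignment))
        if op in ("forall", "exists"):
            var, body = formula[1], formula[2]
            results = (satisfies(model, body, {**assignment, var: d})
                       for d in model["domain"])
            return all(results) if op == "forall" else any(results)
        pred, args = op, formula[1:]          # atomic case
        return tuple(assignment[a] for a in args) in model["pred"][pred]

    def true_in(model, sentence):
        """Truth of a closed sentence: satisfaction by the empty assignment."""
        return satisfies(model, sentence, {})

    m = {"domain": {1, 2, 3}, "pred": {"P": {(1,), (2,)}}}
    print(true_in(m, ("forall", "x", ("P", "x"))))   # False: 3 is not P
    print(true_in(m, ("exists", "x", ("P", "x"))))   # True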

Taylor Charles: THE EXPLANATION OF BEHAVIOR (Routledge & Kegan Paul, 1964)

          Behavior is a function  of  the  state  of  the  system  and  its
          environment; but what brings behavior about is its being required
          to achieve the system's goals.

Thagard Paul: MIND (MIT Press, 1996)

A clear and well-organized textbook on cognitive science.

Thelen Esther & Smith Linda: A DYNAMIC SYSTEMS APPROACH TO THE DEVELOPMENT OF COGNITION AND ACTION (MIT Press, 1994)

          The book describes a  theory  of  early  human  development  (how
          organic form is created, where the information for the adult form
          resides, etc) that applies the theory of nonlinear dynamic systems
          to  biology  and  constitutes a landmark departure from cognitive
          theories.

          The processes that govern human development are the same that act
          on the simplest organisms (and even some nonliving systems). They
          are processes of emergent order and complexity, of how  structure
          arises  from  the  interaction  of  many  independent units. They
          integrate organic ontogeny at every  level,  from  morphology  to
          behavior.

          Drawing from Edelman's neural darwinism, Bertalanffy's and
          Laszlo's general systems theory, Haken's synergetics, and
          Waddington's organismic metaphor, the authors argue that Piaget's
          theory fails, that Chomsky's model of competence and performance
          is flawed, that nativism is implausible, that cognition is con-
          tinuous across development, that Fodor's modules are illogical,
          and that Newell & Simon's information processing model is
          incomplete.  Only connectionism is salvaged, in virtue of its
          similarities with dynamic systems (knowledge as a pattern of
          activity, mental life as only processes, not structures), but
          then it too is discarded as naive and insufficient.

          By using Robert Cairns's analogy (evolution is to biology what
          development  is to psychology, i.e. the process behind the struc-
          ture), the authors advance a theory of  development  that  is  as
          opportunistic  as  evolution.   Knowledge  in the individual ori-
          ginates  in  opportunistic  and  context-specific   psychological
          processes. The emphasis is on processes of change, on ever-active
          self-organizing processes of living systems (analogous to  selec-
          tion algorithms).

          Development  appears  to  be  orderly,  incremental,  directional
          (towards nutritional independence and reproductive maturity). The
          authors' theory, though, is that development is not driven  by  a
          grand  design:  it  is  driven  by  opportunistic,  syncretic and
          exploratory processes. At a closer look, in fact, development  is
          modular  and heterochronic (different organs develop at different
          rates and different times), although the organism progresses as a
          whole.  Global regularities (and simplicity) somehow arise from
          local variabilities (and complexities).

          Development is not structured. Development is the outcome of  the
          interplay  between action and perception within a system that, by
          its thermodynamic nature, seeks stability.  Performance  emerges.
          Cognition  is  an emergent structure, situated and embodied, just
          like any other skill.

          Knowledge for thought and action emerges  from  the  dynamics  of
          pattern formation in the context of neural group selection.  Per-
          ception, action and cognition are rooted in the same pattern for-
          mation processes. Categories arise (self-organize) spontaneously
          and reflect the experiences of acting  and  perceiving,  i.e.  of
          interacting  with  the  world.   More  precisely,  categories are
          created through the cross-relation of multimodal  (hearing,  see-
          ing,  feeling,  etc) experiences.  Unity of perception and action
          is evident in category formation.  The critical  role of movement
          in development is emphasized over and over: movement is a percep-
          tual category.  Being in the world "selects" categories.

          "Meaning is emergent in perceiving and acting  in  specific  con-
          texts and in a history of perceiving and acting in contexts".

          Development can then be viewed as the dynamic selection of
          categories.  Categories are but a specific case of pattern forma-
          tion, but they also are the foundation of cognitive  development.
          Therefore,  cognitive development is a direct consequence of pro-
          perties of nonlinear dynamic systems, of self-configuring complex
          systems.

          These features are shared by all organisms.

Thom René: STRUCTURAL STABILITY AND MORPHOGENESIS (Benjamin, 1975)

          The English translation of the seminal 1972 study that esta-
          blished catastrophe theory as a mathematical tool to classify the
          solutions of nonlinear systems in the neighborhood  of  stability
          breakdown. The "ensembles de catastrophes" are hypersurfaces that
          divide the parameter space into regions of completely different
          dynamics.  Therefore  dynamics and form become dual properties of
          nonlinear systems.
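
The cusp, the simplest nontrivial catastrophe, already shows how such a
hypersurface partitions parameter space. The sketch below is my own
illustration, not from the book: the equilibria of the standard cusp
potential V(x) = x^4/4 + a*x^2/2 + b*x are the real roots of x^3 + a*x + b,
and their number jumps from one to three exactly where the parameter point
(a, b) crosses the catastrophe set 4*a^3 + 27*b^2 = 0.

    import numpy as np

    # Equilibria of the cusp potential V(x) = x**4/4 + a*x**2/2 + b*x
    # are the real roots of dV/dx = x**3 + a*x + b.  The catastrophe
    # set 4*a**3 + 27*b**2 = 0 separates one-equilibrium regions of
    # the (a, b) plane from three-equilibria regions.
    def num_equilibria(a, b, tol=1e-9):
        roots = np.roots([1.0, 0.0, a, b])        # x^3 + a*x + b
        return int(np.sum(np.abs(roots.imag) < tol))

    print(num_equilibria(a=1.0, b=0.0))    # 1: outside the cusp
    print(num_equilibria(a=-3.0, b=0.0))   # 3: inside the cusp
    print(4*(-3.0)**3 + 27*0.0**2)         # -108 < 0 confirms "inside"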

Tipler Frank: THE PHYSICS OF IMMORTALITY (Doubleday, 1995)

          The "omega point theory" of the universe is a  rigorous  physical
          proof  of the existence of an omnipresent, omniscient and omnipo-
          tent god and a proof of the likelihood  that  every  human  being
          will  eventually be resurrected.  The book also contains a physi-
          cal model of life in heaven, hell and  purgatory,  all  based  on
          information theory, quantum mechanics and relativistic cosmology.

          Tipler explores notions as varied as the Bekenstein  bounds  (the
          upper  limits on the number of distinct quantum states and on the
          rate that changes of state can occur, i.e.  the  upper  limit  on
          information  density),  the  Taub  universe (a universe that con-
          tracts at different rates in different directions, in  particular
          collapsing  in one direction while retaining the same size in the
          others, thus leading to an oblate spheroid shape), and the
          omega  point  (the  final singularity of the history of a  closed
          universe, the point of infinite  information,  which  is  neither
          space nor time nor matter, but is beyond all of these and experi-
          ences the whole of universal history all at once); all mixed with
          Nietzsche's philosophy and Aquinas' theology.
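
The Bekenstein bound has a simple closed form: a system of energy E confined
within radius R can hold at most I <= 2*pi*E*R/(hbar*c*ln 2) bits. A
back-of-the-envelope computation (my own illustration, not Tipler's numbers):

    import math

    # Bekenstein bound: I <= 2*pi*E*R / (hbar*c*ln 2) bits.
    hbar = 1.054571817e-34   # J*s
    c    = 2.99792458e8      # m/s

    def bekenstein_bits(energy_joules, radius_meters):
        return 2*math.pi*energy_joules*radius_meters / (hbar*c*math.log(2))

    # Rest-mass energy of 1 kg (E = m*c**2) confined within 1 m:
    print(f"{bekenstein_bits(1.0*c**2, 1.0):.2e} bits")   # ~2.6e43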

          His basic point is that life is information coding  preserved  by
          natural selection: a being is alive if it encodes information and
          such information is preserved over  time  by  natural  selection.
          Given this definition of life, it is possible to compute how much
          energy is sufficient and necessary to extend  this  process  till
          the  very end of time. In a Taub collapsing universe, such energy
          is available for free (as a consequence of temperature difference
          in  different  directions, which becomes infinite as the universe
          approaches its singularity) and life can use it to  survive  for-
          ever.  The  finite  singularity  of the universe and eternal life
          happen to coincide.  It is the  very  collapse  of  the  universe
          (because  it  happens at different rates in different directions)
          that permits life to continue forever.  Gravitational shear (col-
          lapse  at  different rates in different directions) is implied by
          the chaotic nature of Einstein's gravitational equations.

          Tipler points out that even cars and ideas are living beings,  as
          they  encode information and they can self-replicate, albeit with
          the help of another being (a factory  or  a  mind).  But  another
          being is often required by biological systems (many plants need
          a bee to replicate,  males  need  females  to  replicate  and  so
          forth).

          Tipler disproves Penrose's proof that machines cannot be intelli-
          gent. Tipler's definition of intelligence "is" the Turing test.

          Tipler also disproves theorems  of  eternal  return  in  physics:
          Einstein's  universe (closed, finite and unlimited) implied eter-
          nal progress.

Toffoli Tom: CELLULAR AUTOMATA MACHINES (MIT Press, 1987)

Touretzky David: THE MATHEMATICS OF INHERITANCE SYSTEMS (Morgan Kaufmann, 1986)

          Touretzky's inheritance theory  shows  the  similarities  between
          logical  proof  (which is a tree of formulas, with the theorem at
          the root and the axioms as the leaves) and  paths  (sequences  of
          nodes) that are explored during a search within a network.

          Touretzky argues that there is  a  natural  partial  ordering  of
          defaults in inheritance systems that is implicit in the hierarch-
          ical structure of the inheritance  graph:  the  inferential  dis-
          tance,  which determines subclass/superclass ordering (a class is
          a subclass of another class if there is an inheritance path  from
          the  former  to  the latter). Touretzky claims that default rules
          about subclasses should override default rules about the  superc-
          lasses that contain them. Subclasses override superclasses.

          The best path in a network is the one that minimizes  inferential
          distance  (as  opposed to the shortest path method of traditional
          inheritance systems, i.e., the shortest proof is not  always  the
          best proof).
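
A minimal sketch of the idea, under simplifying assumptions of my own (an
acyclic IS-A graph and defaults attached to classes; the encoding and the
function name are illustrative): a property is inherited from the nearest
class along an inheritance path, so a subclass default preempts a superclass
default.

    # Toy inferential-distance inheritance: the default attached to
    # the most specific ancestor wins (subclasses override
    # superclasses).  Illustrative only; Touretzky's formal path
    # semantics also handles conflicting multiple paths.
    ISA = {                      # node -> parents
        "tweety": ["penguin"],
        "penguin": ["bird"],
        "bird": ["animal"],
        "animal": [],
    }
    DEFAULTS = {                 # class -> {property: value}
        "bird": {"flies": True},
        "penguin": {"flies": False},
    }

    def inherit(node, prop):
        """Breadth-first search: the first class found carrying the
        property is the one at minimal inferential distance."""
        frontier = [node]
        while frontier:
            for cls in frontier:
                if prop in DEFAULTS.get(cls, {}):
                    return DEFAULTS[cls][prop]
            frontier = [p for cls in frontier for p in ISA[cls]]
        return None

    print(inherit("tweety", "flies"))   # False: penguin overrides bird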

Trehub Arnold: THE COGNITIVE BRAIN (MIT Press, 1991)

          Trehub offers a broad theory of how the  brain  works,  based  on
          actual neural mechanisms.

Tulving Endel: ORGANIZATION OF MEMORY (Academic Press, 1972)

          A collection of articles on memory. Tulving distinguishes between
          episodic memory (which receives and stores information about tem-
          porally dated episodes and temporal-spatial relations among them)
          and  semantic  memory  (organized  knowledge  about  the  world).
          Episodic memory is a faithful record of a person's experience.

Tulving Endel: ELEMENTS OF EPISODIC MEMORY (Oxford Univ Press, 1983)

          Tulving expands on  his  distinction  of  episodic  and  semantic
          memories.  Tulving  now  recognizes  two  higher level classes of
          memory: procedural and propositional.  Propositional memory can
          be subdivided into episodic and semantic memories.  A detailed
          conceptual framework of episodic memory is provided that details
          processes of encoding and retrieval in episodic memory.  Accessi-
          bility of a  piece  of  information  depends  on  the  conditions
          ("cues")  under which that piece of information has been learned.
          The remembering of  events  always  depends  on  the  interaction
          between  encoding and retrieval conditions (compatibility between
          the "engram" and the "cue").

          Tulving's experiments proved that intension and extension are
          dealt with by two different types of memory: episodic memory
          contains specific episodes of  the  history  of  the  individual,
          while  semantic  memory  contains general knowledge applicable to
          different situations.  A perception relates  extensional  objects
          with  intensional  concepts,  and the speech act relates concepts
          with words: between word and object there  is  only  an  indirect
          relationship.

          In a subsequent paper Tulving proposed to  distinguish  different
          memory  systems  based on the following characteristics: kinds of
          information they  process,  operations  that  can  be  performed,
          neural substrates that are affected, timing of appearance in phy-
          logenetic and ontogenetic development, and format of  representa-
          tion.  A  memory  system can therefore be defined in terms of its
          brain mechanisms, the information it processes and the principles
          of its operation.

Turbayne Colin Murray: THE MYTH OF METAPHOR (Yale Univ Press, 1962)

          Turbayne treats metaphor not as a linguistic phenomenon, but as a
          philosophical one.

          Descartes and Newton founded modern science on  the  basis  of  a
          metaphysics of mechanism. Turbayne presents a different metaphor:
          he treats events in nature as if they compose a language, and the
          world as a universal language.

Turing Alan Mathison: MORPHOGENESIS (North-Holland, 1992)

          A collection of historical papers by  Turing.  In  "The  chemical
          basis of morphogenesis" (1952) he advanced the reaction-diffusion
          theory of pattern formation, based on the bifurcation  properties
          of the solutions of differential equations.

          Turing devised a model to generate stable patterns:

            X catalyzes itself; X diffuses slowly
            X catalyzes Y; Y diffuses quickly
            Y inhibits X
            Y may or may not catalyze or inhibit itself

          Some reactions might be able to create ordered spatial schemes
          from disordered schemes.  The function of genes is purely cata-
          lytic: they catalyze the production of new morphogens, which
          will catalyze more morphogens until eventually form emerges.
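
A minimal numerical sketch of such an activator-inhibitor scheme (one spatial
dimension, forward-Euler integration; the particular kinetics are a generic
Gierer-Meinhardt-style choice of mine, not Turing's original equations):
slow-diffusing X catalyzes itself and Y, fast-diffusing Y inhibits X, and a
near-uniform initial state develops spatial structure.

    import numpy as np

    # 1-D activator-inhibitor reaction-diffusion, forward Euler.
    # X: slow-diffusing self-catalyst; Y: fast-diffusing inhibitor.
    n, dt, dx = 100, 0.01, 1.0
    Dx, Dy = 0.05, 1.0                       # X diffuses slowly, Y quickly
    rng = np.random.default_rng(0)
    X = 1.0 + 0.01*rng.standard_normal(n)    # near-uniform start
    Y = 1.0 + 0.01*rng.standard_normal(n)

    def laplacian(u):                        # periodic boundary
        return (np.roll(u, 1) + np.roll(u, -1) - 2*u) / dx**2

    for _ in range(20000):
        lapX, lapY = laplacian(X), laplacian(Y)
        X, Y = (X + dt*(X*X/Y - X + Dx*lapX),   # autocatalysis, decay
                Y + dt*(X*X   - Y + Dy*lapY))   # made by X, inhibits X

    print(X.round(2))   # peaks and valleys: structure out of near-uniformity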

Turing Alan: PURE MATHEMATICS (Elsevier Science, 1992)

A collection of historical papers by Turing.

          In 1936 with his seminal paper "On computable numbers" Alan  Tur-
          ing  defined computation as the formal manipulation of symbols by
          the application of formal rules.

          A Turing machine is capable  of  performing  all  the  operations
          that  are  needed  to perform logical calculus: read current sym-
          bols, process them,  write  new  symbols,  examine  new  symbols.
          Depending  on  the  symbol that it is reading and on the state in
          which it is, the Turing machine decides whether  it  should  move
          on, move backwards, write a symbol, change state or stop.  Turing's
          machine is an automatic formal system: a system that automatically
          manipulates an alphabet of symbols according to a finite set of
          rules.
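
A compact sketch of such a machine (my own minimal encoding, not Turing's
quintuple notation): the transition table maps a (state, symbol) pair to the
symbol to write, a head move, and the next state.

    # A minimal Turing machine simulator.  Transition table:
    # (state, symbol) -> (symbol to write, head move, next state).
    def run(table, tape, state="start", blank="_", max_steps=10000):
        cells, head = dict(enumerate(tape)), 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, blank)
            write, move, state = table[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    # Example machine: flip every bit, halt on the first blank.
    flipper = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }
    print(run(flipper, "1011"))   # 0100_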

          The universal machine is a Turing machine capable of simulating
          all possible Turing machines.  It contains a sequence of sym-
          bols that describes the specific  Turing  machine  that  must  be
          simulated. For each computational procedure the universal machine
          is capable of simulating a machine that performs that  procedure.
          The  universal machine is therefore capable of computing any com-
          putational function.

Turing Alan: MECHANICAL INTELLIGENCE (Elsevier Science, 1992)

A collection of historical papers by Turing.

          In "Computing machinery and intelligence" (1950) Turing  proposed
          a  famous test to verify whether a machine is intelligent or not:
          ask the same questions of a machine and a  human  being,  without
          being told which one is which, and if you can't tell which one is
          which, then the machine is intelligent.

Turner Raymond: LOGICS FOR ARTIFICIAL INTELLIGENCE (Ellis Horwood, 1985)

          A short, but clear, introduction to  non-standard  logics:  modal
          logic,   epistemic  logic,  multi-valued  logics,  intuitionistic
          logic, theory of types, non-monotonic reasoning,  temporal  logic
          and fuzzy logic.

Turner Scott: THE CREATIVE PROCESS (Lawrence Erlbaum, 1994)

          A theory of creativity and a case-based computer prototype ("Min-
          strel")  that generates stories. Art is viewed as a problem solv-
          ing activity, and an author  as  a  problem  solver  who  employs
          knowledge  encoded  in cases. Creativity is an integrated process
          of search and adaptation guided by creativity heuristics:  it  is
          an  extension of problem solving that is driven by the failure of
          problem solving and creative alternatives are  created  by  using
          old knowledge in new ways.

          The architecture employs four classes of goals: thematic goals
          (development of the story theme, point, moral), consistency goals
          (plausibility constraints), drama goals  (artistic  quality)  and
          presentation goals (effective style).

Turvey Michael: PERCEIVING, ACTING AND KNOWING (Lawrence Erlbaum, 1977)

          A psychological theory of how cognition and action interact.   An
          action  can be performed in many different ways, i.e. the nervous
          system has to deal with degrees of freedom. It solves the problem
          through  a  hierarchical  command  structure.  Every level of the
          hierarchy adds detail to the overall goal of  the  action.  Lower
          levels  have  a  degree  of autonomy, higher levels exert control
          over lower  units  by  tuning  the  parameters  that  define  the
          features of the lower units and by tuning the pathways connecting
          them.

Tversky Amos, Kahneman Daniel & Slovic Paul: JUDGMENT UNDER UNCERTAINTY (Cambridge University Press, 1982)

          A collection of essays on heuristics and biases, as introduced by
          Tversky.   The  fundamental  assumption  is that people rely on a
          limited set of heuristic principles  which  greatly  reduces  the
          task  of  assessing probabilities: representativeness (the degree
          to which an event is representative of a class of events),  avai-
          lability (the degree to which past occurrences of an event can be
          brought to mind) and adjustment (the degree to which the  initial
          approximate  value  must  be  changed). Representativeness can be
          viewed as "connotative" distance, availability can be  viewed  as
          "associative" distance.

          People employ heuristics to answer questions such as: what is the
          probability that an object belongs to a given class? that an
          event originates from a given process? that a process will
          generate a given event? Normatively relevant factors, such as
          prior probabilities of outcomes, sample size and predictability,
          should affect these judgments, but the heuristics tend to neglect
          them.
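
A worked example of the base-rate point, with numbers of my own choosing: if
a description sounds "representative" of an engineer, people give roughly the
same answer whether engineers are 30% or 70% of the sampled population, yet
Bayes' rule says the prior should matter.

    # Base-rate neglect: the posterior depends strongly on the prior
    # even when the evidence ("sounds like an engineer") is fixed.
    # Illustrative numbers, not Tversky and Kahneman's actual data.
    def posterior(prior_eng, p_desc_if_eng=0.8, p_desc_if_lawyer=0.3):
        p_desc = (p_desc_if_eng * prior_eng +
                  p_desc_if_lawyer * (1 - prior_eng))
        return p_desc_if_eng * prior_eng / p_desc

    print(round(posterior(0.30), 2))   # 0.53 with a 30% base rate
    print(round(posterior(0.70), 2))   # 0.86 with a 70% base rate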

          At the same  time,  deviations  of  subjective  probability  from
          objective  probability are systematic. Experiments show that peo-
          ple predict by similarity (representativeness). Experiments  also
          show that causal inferences have greater efficacy than diagnostic
          inferences.

          Tversky criticizes probabilistic reasoning as a way to describe
          human thinking because it is subject to "framing effects". Tversky &
          Shafer offered a  "constructivist"  theory  of  probabilities  in
          which probabilities describe an ideal situation that can still be
          related to the real situation.

Tye Michael: THE METAPHYSICS OF MIND (Cambridge University Press, 1989)

          There are no mental events (beliefs or  desires)  and  no  mental
          objects (such as pain or images). Drawing from Sellars's "adver-
          bial" theory of sensing, Tye develops his own "operator" theory
          in  which   sensory adverbs are analyzable as predicate operators
          added to a standard predicate calculus.

          Tye thinks that the phenomenal aspects of experience ("what it is
          like") are unrelated to their representational contents.

Tye Michael: THE IMAGERY DEBATE (MIT Press, 1991)

          Tye proposes a unified theory of  mental  imagery  that  embraces
          both the visual stance and the linguistic stance, that tries
          to bridge Stephen Kosslyn's  pictorialism  and  Zenon  Pylyshyn's
          descriptionalism  (the  two  main  opposite schools of thought on
          what kind of representational  structures  images  exactly  are).
          Tye  believes  that  the  experimental  evidence supports a mixed
          theory of pictorialism and descriptionalism.

          The book also provides a comprehensive introduction to  the  his-
          tory of the debate, from Aristotle to Kosslyn, Pylyshyn, Marr and
          Hinton.

Tye Michael: TEN PROBLEMS OF CONSCIOUSNESS (MIT Press, 1995)

          Tye dramatically changes his theory of the mind,  admitting  that
          phenomenal  aspects  of mental life are representational and that
          they are not to be found in neural events.

          First, Tye lists ten problems of phenomenal  consciousness,  such
          as  ownership  (feelings are private to an individual, i.e.  "why
          can't anybody else feel my feelings?") and perspectival subjec-
          tivity  (feelings  can be understood only by individuals who have
          felt them), and more traditional issues such  as  duplicates  and
          inverted qualia.

          Then he develops a theory of the mind that solves  all  problems:
          phenomenal  states  are  both  perspectival  and  physical.   All
          experiences have representational content,  not  just  perceptual
          ones.   For  example,  emotions  are  sensory  representations of
          bodily changes.  Sensory states represent  external  features  in
          the sense that they track the presence of those features ("causal
          covariation theory").

          All experiences and  all  feelings  represent  things  and  their
          phenomenal  aspects  are  to  be understood in terms of what they
          represent.

Ulanowicz Robert: GROWTH AND DEVELOPMENT (Springer-Verlag, 1986)

          In order to explain growth and development (not only of individu-
          als,  but  also of ecosystems, societies and economies), the book
          introduces  a  new  thermodynamic  quantity:  "ascendency"  is  a
          phenomenological  measure  increasing  with ecological succession
          and decreasing in stressed ecosystems.  Ascendency  reflects  the
          ability of a system to prevail against other configurations.

          The second law of thermodynamics is interpreted  as  stating  the
          impossibility, for any ecosystem component, of converting all its
          energetic and material inputs into ordered biomass, i.e. a frac-
          tion of them is always dissipated.  Ecological systems are non-
          equilibrium systems.  Prigogine's theorem is invoked to show that
          near  equilibrium  forces and flows of a steady state system tend
          to minimize entropy production.

          Growth and development are formalized through the concept of net-
          works  of  (energetic  and  material)  flows.  This formalization
          applies to all levels of the biological hierarchy, from cells  to
          biosphere, and even to nonbiological systems. Networks of flows
          can be reduced to elements of linear algebra and  a  calculus  of
          measured  flows can be reduced to information theory, once infor-
          mation is defined as the magnitude  of  decrease  in  uncertainty
          (uncertainty  being  the logarithm of the probability of the out-
          come).
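
In this formalization ascendency can be computed directly from a flow matrix:
if T[i][j] is the flow from compartment i to compartment j and T the total
system throughput, ascendency is the sum of the flows weighted by the
mutual-information term of their row and column totals. A minimal sketch with
a hypothetical three-compartment flow matrix of my own:

    import math

    # Ascendency A = sum_ij T_ij * log( T_ij * T / (out_i * in_j) ),
    # with T the total system throughput.  The 3-compartment flows
    # below are illustrative, not data from the book.
    T = [[0.0, 8.0, 2.0],
         [1.0, 0.0, 6.0],
         [3.0, 1.0, 0.0]]

    def ascendency(flows):
        total = sum(map(sum, flows))
        out = [sum(row) for row in flows]            # outflows
        inn = [sum(col) for col in zip(*flows)]      # inflows
        return sum(t * math.log2(t * total / (out[i] * inn[j]))
                   for i, row in enumerate(flows)
                   for j, t in enumerate(row) if t > 0)

    print(f"{ascendency(T):.2f} flow-bits")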

          The process of growth and development is eventually summarized in
          a variational principle (ascendency is maximized subject to a set
          of conservation constraints).  Such a principle of "optimal ascen-
          dency"  specifies  the  influence  of  higher-scale events on the
          lower levels of the hierarchy.  Fitness must be redefined as  the
          ability  of  organisms  to play a coherent role in the network of
          ecological processes.

Ullman Shimon: THE INTERPRETATION OF VISUAL MOTION (MIT Press, 1979)

          A computational theory of how the sensory input of the  eye  ori-
          ginates  a  representation  of  the  environment  in  the mind is
          divided into two problems: the "correspondence"  problem  (recog-
          nizing  a  piece  of the image as an individual object in motion)
          and the 3-d interpretation problem.  The  former  is  solved   by
          reducing  the  image  to a set of tokens and applying similarity-
          based reasoning to them. The latter is solved by the interplay of
          a  "structure  from motion" process and a "motion from structure"
          process.

Underwood Geoffrey: ASPECTS OF CONSCIOUSNESS (Academic Press, 1982)

          A monumental (three volumes) collection of articles on conscious-
          ness written by psychologists.

Unger Peter: IDENTITY, CONSCIOUSNESS AND VALUE (Oxford Univ Press, 1991)

          Unger indulges in all sorts of thought experiments about personal
          identity.  What happens if a brain is replaced or exchanged? Can
          one person fade into another?

          Survival of personal identity over time requires continuous physi-
          cal realization of it by a physically continuous succession of
          realizers beginning with the current one.  Physical continuity
          entails properties such as gradual replacement of matter (e.g.,
          most cells in the body are continuously replaced) and constitu-
          tional cohesion (adhesion of the parts, at the smallest level).

Valiant Leslie: CIRCUITS OF THE MIND (Oxford University Press, 1994)

          Valiant's "neuroidal model" attempts to explain the brain's  pro-
          digious  capability  to store and process information by assuming
          that neurons and neural connections have  an  internal  structure
          which  matters.  Each neuroid is a linear threshold element, aug-
          mented with states and a timing mechanism (to  reflect  the  syn-
          chronized rhythmic behavior of the cortex).  Valiant assumes that
          a cognitive substrate made of a few elementary  functions  drives
          the  neuroidal  net.  When one of these functions is activated by
          interaction with the environment, a neural circuit  is  modified,
          and  such  a  change  contributes  to  successive  action  in the
          environment. A detailed computational model is worked out.
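
A toy version of such an element (far simpler than Valiant's neuroids; the
two-state update rule and the names are my own illustration): a linear
threshold unit whose first firing also switches a discrete state, standing in
for the modification of a circuit by experience.

    # A toy "neuroid": a linear threshold element plus a state.
    from dataclasses import dataclass

    @dataclass
    class Neuroid:
        weights: list
        threshold: float
        state: str = "ready"     # discrete memory state

        def step(self, inputs):
            """Fire iff the weighted sum reaches threshold; the first
            firing also commits the state (a stand-in for the circuit
            modification described above)."""
            total = sum(w*x for w, x in zip(self.weights, inputs))
            fired = total >= self.threshold
            if fired and self.state == "ready":
                self.state = "committed"
            return fired

    n = Neuroid(weights=[0.6, 0.6], threshold=1.0)
    print(n.step([1, 0]), n.state)   # False ready
    print(n.step([1, 1]), n.state)   # True committed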

          Valiant starts by providing  neurobiological  details  about  the
          neocortex.

Van Benthem Johan: THE LOGIC OF TIME (Kluwer, 1991)

          In this 1983 essay, Van Benthem argues that time should not be
          studied  only  with  either  the  instant-based  ontology  or the
          interval-based ontology, but that both should be used at the same
          time.  Ontological plurality is necessary to couple any theory of
          time with theories of other domains.

          The second part of the book deals with temporal discourse.

          The book covers all the main formal approaches to time in a  very
          technical and comprehensive manner.

Van Benthem Johan: A MANUAL OF INTENSIONAL LOGIC (Univ of Chicago Press, 1988)

          Intensional logic is useful for  semantically  explaining  inten-
          sional  contexts  in natural language through multiple reference.
          Intensional logic provides tools such as tense, modality and con-
          ditionals.  Formal descriptions are given of applications such as
          temporal logic and intuitionistic logic.

Van Benthem Johan: LANGUAGE IN ACTION (MIT Press, 1995)

          A lucid treatise on the logical foundations of categorial grammar
          that  covers a broad spectrum, from lambda calculus to the theory
          of types, from proof theory to model theory.  In the last chapter
          the author advances the concept of a "logic of information", with
          a modal logic of information patterns (to deal  with  the  static
          structure of information representation) and a relational algebra
          of control (to deal with the  dynamic  structure  of  information
          processing)  and  a  type-theoretic dynamic logic that integrates
          the two aspects.

Varela Francisco, Thompson Evan & Rosch Eleanor: THE EMBODIED MIND (MIT Press, 1991)

          Following  Merleau-Ponty's  philosophical  thought,  the  authors
          argue  in  favor  of  a  stance that views the human body both as
          matter and as experience, both  as  a  biological  entity  and  a
          phenomenological entity.  Drawing inspiration from Buddhist medi-
          tative practice, they tackle the nonunified character of the self
          and  propose  an  "enactive"  approach to cognition: cognition as
          embodied action (or enaction), evolution not as  optimal  adapta-
          tion  but  as  "natural  drift".  In the context of emergence and
          self-organization, the book finds  scientific  evidence  for  the
          emergent formation of direct experience without the need to posit
          the existence of a self. The mind is selfless. "Self" refers to a
          set  of  mental  and  bodily formations that are linked by causal
          coherence over time.  The self as the homunculus inside our  head
          is  an  illusion.  At the same time the world is not a given, but
          reflects the actions in which we engage, i.e.   it  is  "enacted"
          from  our  actions  (or  structural coupling) and filtered by our
          common sense.

          Organisms do  not  adapt  to  a  pregiven  world.  Organisms  and
          environment  mutually  specify each other. Organisms drift natur-
          ally in the environment.  Environmental regularities  arise  from
          the  interaction  between  a living organism and its environment.
          The world of an organism is enacted by the history of its  struc-
          tural  coupling with the environment.  Perception is perceptually
          guided action  (sensorimotor  enactment).   Cognitive  structures
          emerge from the recurrent sensorimotor activity that enables such
          a process.  Perceptually guided action is constrained by the need
          to  preserve  the  integrity  of  the organism (ontogeny) and its
          lineage (phylogeny).

Varela Francisco: PRINCIPLES OF BIOLOGICAL AUTONOMY (North Holland, 1979)

          The book merges the themes of autonomy of natural  systems  (i.e.
          internal  regulation,  as  opposed to control) and their informa-
          tional abilities (i.e., cognition) into the  theme  of  a  system
          possessing  an  identity  and  interacting  with  the rest of the
          world.

          The organization of a system is the set of relations that  define
          it as a unity.  The structure of a system is the set of relations
          among its components.  The organization of a system  is  indepen-
          dent  of the properties of its components. A machine can be real-
          ized by many sets of components and relations among them. Homeos-
          tatic systems are systems that keep the values of their variables
          within a small range of values, i.e. whose organization makes all
          feedback internal to them. An autopoietic system is a homeostatic
          system that continuously generates its own organization (by con-
          tinuously producing components that are capable of reproducing the
          organization that created them). Autopoietic systems turn out  to
          be autonomous, have an identity, are unities, and they compensate
          external perturbations with internal structural  changes.  Living
          systems  are  autopoietic  systems in the physical space. The two
          main  features  of  living  systems  follow  from   this:   self-
          reproduction can only occur in autopoietic systems, and evolution
          is a direct consequence of self-reproduction.

          Every autonomous system  is  organizationally  closed  (they  are
          defined  as  a unity by their organization). An autonomous system
          cannot be  described  without  describing  its  observer.  Varela
          presents  a computational framework (the calculus of indications)
          within which features of processes of systems (such  as  distinc-
          tion,  whereby  unities  are  differentiated, recursion and self-
          reference, whereby unities are constructed) can be formalized.  A
          unity becomes specified through operations of distinction (neces-
          sary conditions on the relations among its components) by an
          observer, in the tradition of Spencer-Brown's calculus of
          indications.  The input/output paradigm is replaced
          by a circular paradigm, which follows from the closure thesis.

          The structure constitutes the system and determines its  behavior
          in  the  environment;  therefore,  information  is  a  structural
          aspect, not a semantic one (there is no need for a representation
          of  information).  Information  is  "codependent".  Mechanisms of
          information and mechanisms of identity are dual.  The cognitive
          domain  of an autonomous system is the domain of interaction that
          it can enter without loss of closure. An autonomous  unit  always
          exhibits two aspects: it specifies the distinction between itself
          and not-itself, and deals with its  environment  in  a  cognitive
          fashion.  Every autonomous system (ecosystems, societies, brains,
          conversations) is a "mind" (in the sense of cognitive processes).

Von Bertalanffy Ludwig: GENERAL SYSTEMS THEORY (Braziller, 1968)

          A textbook, introduction and history (by the inventor himself) to
          the  discipline of general systems, which emerged out of the need
          to explain phenomena in a variety of fields and out of  the  need
          to  provide  a unified view on all types of systems. General sys-
          tems theory was born before cybernetics, and  cybernetic  systems
          are merely a special case of self-organizing systems.

          The  classical  approach  to  the  scientific  description  of  a
          system's  behavior  can be summarized as the search for "isolable
          causal trains" and reduction to atomic units.  This  approach  is
          feasible  under two conditions: 1. that the interaction among the
          parts of the system be negligible and 2.  that  the  behavior  of
          parts  be linear. Von Bertalanffy's "systems", on the other hand,
          are those entities ("organized  complexities")  that  consist  of
          interacting  parts,  usually described by a set of nonlinear dif-
          ferential equations.  Systems  theory  studies  principles  which
          apply  to  all  systems,  properties that apply to any entity qua
          system. Alternatives to system theory include compartment  theory
          (which  views a system as a set of units upon which boundary con-
          ditions and transport processes bear), set theory, graph  theory,
          information   theory,  automata  theory,  game  theory,  decision
          theory, etc.

          Basic concepts of systems theory are introduced: every whole is
          based upon the competition among its parts; individuality is the
          result of a never-ending process of progressive centralization
          whereby certain parts gain a dominant role over the others.

          General systems theory looks for laws that can be  applied  to  a
          variety of fields (isomorphisms of laws in different fields), par-
          ticularly in the biological, social and  economic  sciences  (but
          even history and politics).

          A subset of general systems theory is  open  systems  theory.   A
          change  in entropy in closed systems is always positive: order is
          continually destroyed.  In  open  systems,  on  the  other  hand,
          entropy  production  due to irreversible processes is balanced by
          import of negative entropy (as in all living  organisms).  If  an
          organism  is viewed as an open system in a steady state, a theory
          of organismic processes can be worked out.

          Even better, a living organism can be viewed  as  a  hierarchical
          order  of  open systems, where each level maintains its structure
          thanks to continuous change  of  components  at  the  next  lower
          level.  Living organisms maintain themselves in spite of continu-
          ous irreversible processes and even proceed  towards  higher  and
          higher degrees of order.

          The author also examines Whorf's hypothesis and the relativity of
          categories  (which  are  assumed to depend on both biological and
          cultural factors).

Von Neumann John: THE COMPUTER AND THE BRAIN (Yale Univ Press, 1958)

          Von Neumann describes the neural  system  of  the  brain  from  a
          mathematical  point  of  view, i.e. viewed as an automaton, using
          techniques and concepts of the digital computer.

Von Neumann John: THEORY OF SELF-REPRODUCING AUTOMATA (Univ of Illinois Press, 1966)

          In this posthumous book Von Neumann explores the idea that a
          machine could be programmed to make a copy of itself.

          Life is a particular class of automata. Life's main  property  is
          the  ability  to reproduce. Von Neumann's automaton was conceived
          to absorb matter from the environment and  process  it  to  build
          another   automaton,  including  a  description  of  itself.  Von
          Neumann's idea of the dual genetics of self-reproducing  automata
          (that the genetic code must act as instructions on how to build
          an organism and as data to be passed on to the offspring) was
          basically the idea behind what would later be called DNA: DNA
          encodes the instructions for making all the enzymes and the
          proteins that a cell needs to function, and DNA makes a copy of
          itself every time the cell divides in two.

          Von Neumann indirectly understood other properties of  life:  the
          ability  to  increase  its  complexity  (an organism can generate
          organisms that are more complex than itself) and the  ability  to
          self-organize.

          When a machine (e.g., an assembly line)  builds  another  machine
          (e.g.,  an  appliance), there occurs a degradation of complexity,
          whereas the offspring of living organisms are at least as com-
          plex  as  their  parents and their complexity increases in evolu-
          tionary times. A self-reproducing machine is a machine that  pro-
          duces another machine of equal or higher complexity.

          By representing an organism as a group of contiguous multi-state
          cells (either empty or containing a component) in a 2-dimensional
          matrix, Von Neumann proved that a Turing-type  machine  that  can
          reproduce itself could be simulated by using a 29-state cell com-
          ponent.
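
Von Neumann's 29-state rule table is far too large to reproduce here, but the
underlying machinery, the synchronous update of a grid of multi-state cells
by a local rule over the four-neighbor "Von Neumann neighborhood", fits in a
few lines. A minimal sketch with a stand-in two-state parity rule (related to
Fredkin's self-replicating rule, not to Von Neumann's constructor):

    # Synchronous update of a 2-D toroidal grid by a local rule.
    # The XOR-of-neighbors rule below is an arbitrary stand-in for
    # Von Neumann's engineered 29-state rule table.
    def step(grid):
        n, m = len(grid), len(grid[0])
        def neighbors(i, j):     # Von Neumann neighborhood: N, S, W, E
            return (grid[(i-1) % n][j] + grid[(i+1) % n][j] +
                    grid[i][(j-1) % m] + grid[i][(j+1) % m])
        return [[neighbors(i, j) % 2 for j in range(m)] for i in range(n)]

    grid = [[0]*9 for _ in range(9)]
    grid[4][4] = 1                         # a single seed cell
    for _ in range(4):
        grid = step(grid)
    for row in grid:
        print("".join(".#"[c] for c in row))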

          Turing proved that there exists a universal computing machine.
          Von Neumann proved that there exists a universal constructing
          machine which, given the description of an automaton, will con-
          struct a copy of it; by extension, there exists a universal
          constructing machine which, given the description of a universal
          constructing machine, will construct a copy of it; and, by
          further extension, there exists a universal constructing machine
          which, given a description of itself, will construct a copy of
          itself.

Vosniadou Stella & Ortony Andrew: SIMILARITY AND ANALOGICAL REASONING (Cambridge University Press, 1989)

          A collection of papers from a workshop.

          Lance Rips focuses on the  distinction  between  deep  (based  on
          underlying  properties) and perceptual (surface) similarity. Rips
          contrasts family resemblance models of categorization with a
          model of inference to the best explanation.

          Lawrence Barsalou emphasizes the  instability  of  concepts  that
          affects   both   intra-category  and  inter-category  similarity.
          Ryszard Michalski presents a theory of concept definition whereby
          concept  meaning  is defined by a base concept representation and
          an inferential concept interpretation.

          Analogical reasoning is discussed by Dedre Gentner, whose
          structure-mapping process relies on relational commonalities
          rather than mere similarities of features.

Vygotsky Lev: MIND IN SOCIETY (Harvard Univ Press, 1978)

          Vygotsky thinks that higher mental functions have social origins.
          Language  is a system of signs that the individual needs in order
          to interact with the environment; only afterwards is it
          internalized and can be used to express thought. The meaning
          of a word is initially a purely emotional fact.  Only with time
          does it acquire a precise reference to an object and then an
          abstract meaning.

          Child development is a  sequence  of  stages  that  lead  to  the
          transformation  of an interpersonal process into an intrapersonal
          process.

          Children think by memorizing, while adults memorize by  thinking.
          In  children  something  is  memorized,  in adults the individual
          memorizes something.  In  the  former  case  a  link  is  created
          because  of  the  simultaneous  occurrence of two stimuli. In the
          latter case the individual creates  that  link.   Remembering  is
          transformed  into  an external activity.  Humans are then able to
          influence their relation with the environment  and  through  that
          environment  change  their  own behavior. The mastering of nature
          and the mastering of behavior are interdependent.

Vygotsky Lev: THOUGHT AND LANGUAGE (MIT Press, 1964)

          Thought and speech have different roots and only later in onto-
          genesis do they become entwined. The relationship between thought
          and speech therefore varies at different developmental stages of
          the child.

Waddington C.H.: PRINCIPLES OF DEVELOPMENT AND DIFFERENTIATION (Macmillan, 1966)

          By analyzing the processes of  differentiation  in  time  (histo-
          genesis), in space (regionalization) and in shape (morphogenesis)
          during embryo development,  Waddington  argues  that  development
          must be genetically determined, like a ball rolling into
          progressively deepening valleys as time progresses.  Once they
          start,  developmental  processes  become more and more stable and
          more and more differentiated.  The "epigenetic landscape" depicts
          the process of "canalization" (increasing differentiation of tis-
          sues and organs during embryogenesis).

Wagman Morton: COGNITIVE SCIENCE AND CONCEPTS OF MIND (Praeger, 1991)

          A general introduction to the themes of  artificial  intelligence
          and  cognitive  science, from Turing's test to problem solving in
          production systems, from conceptual dependency systems to  learn-
          ing  systems. Each historical system/project of artificial intel-
          ligence (BORIS, CYRUS, ACT, LEX, AM, BACON) is briefly described,
          together with its cognitive implications.

Waldrop Mitchell: MAN-MADE MINDS (Walker, 1987)

          An accessible introduction to the ideas, the history and the sys-
          tems of artificial intelligence.

Waldrop Mitchell: COMPLEXITY (Simon & Schuster, 1992)

          Complexity is presented as a discipline that can unify  the  laws
          of  physical, chemical, biological, social and economic phenomena
          through the simple principle that all things in nature are driven
          to organize themselves into patterns. The book, written in plain
          English, focuses on the Santa Fe Institute school of thought.
          Lots of biographies and a history of the field.

Waltz David: SEMANTIC STRUCTURES (Lawrence Erlbaum, 1989)

          A collection of articles on natural language processing.  Michael
          Dyer  discusses  BORIS, a system for story understanding based on
          Schank's conceptual dependency.  Wendy  Lehnert  discusses  "plot
          units" for discourse analysis.

Way Eileen Cornell: KNOWLEDGE REPRESENTATION AND METAPHOR (Kluwer Academic, 1991)

          Metaphor is the essence of our ability to represent the world, to
          assimilate new knowledge into the old.  Metaphor is better suited
          than logic to represent knowledge.

          Still, metaphor presents a number of  obvious  problems:  how  to
          determine its truth value (literally, metaphors are almost always
          false) and how to recognize an expression as  a  metaphor  (meta-
          phors have no consistent syntactic form).

          Way claims that literal  language  is  not  context-free  either.
          Literal  and  figurative  language  are  both  context-dependent.
          Figurative language cannot be reduced to literal language,
          because the literal is not primitive either.  What determines
          literal or figurative speech is the speaker's intent to select a
          particular perspective on a type hierarchy, and how the concepts
          employed in the speech relate to their locations in the hierarchy.

          The perspective intended by the speaker is revealed by  the  con-
          text,  which is represented by a "mask" on the type hierarchy. If
          the perspective invoked by the context complies with the classif-
          ication of natural kinds, speech is literal.

          Sentences translate into conceptual graphs, and conceptual graphs
          relate  the  concepts  of  the sentence to a type hierarchy.  The
          meaning of a concept is a partial function of its location  in  a
          type hierarchy.

          The type hierarchy changes dynamically because of the continuous
          change in cultural and social conventions.

          Way's formalism is based on Sowa's conceptual graphs, modified so
          that  more  perspectives  ("masks")  are possible. Way's model of
          metaphor is  based  on  Black's  interactionist  model  (metaphor
          involves  a  transfer  of knowledge and actually creates similar-
          ity).

Webelhuth Gert: GOVERNMENT & BINDING THEORY (MIT Press, 1995)

          A  collection  of  articles  by  authoritative  researchers   who
          describe  different approaches to equip Chomsky's universal gram-
          mar with constraints.  The editor surveys progress  made  in  the
          field since its invention.

          The other articles provide an updated view on  current  research.
          Drawing  from  Fillmore's  cases and Gruber's thematic relations,
          Edwin Williams discusses "theta theory" (the theory  of  thematic
          roles  with respect to a predicate, or theta roles).  James Huang
          examines the relationship between syntax  (linguistic  form)  and
          semantics (logical form).

Weber Bruce, Depew David & Smith James: ENTROPY, INFORMATION AND EVOLUTION (MIT Press, 1988)

          The thesis of this book is that biological phenomena are governed
          by  laws  that  are purely physical.  Evolutionary change results
          from the interplay of two elementary and  independent  processes:
          genetic  variation  and differential reproduction (natural selec-
          tion).

          A number of essays provide historical surveys  of  nonequilibrium
          thermodynamics applied to evolutionary and ecological topics.

          By focusing on entropy, structure and information, the essays  of
          this  book shed some light on the relationship between cosmologi-
          cal evolution and biological evolution. Thanks to the  advent  of
          non-equilibrium  thermodynamics,  it  is  now  possible to bridge
          thermodynamics and evolutionary biology.  This step  might  prove
          as  powerful  as  the synthetic theory of evolution, which merged
          the Mendelian genetics (a theory of inheritance) and evolutionary
          biology (a theory of species).

          Equilibrium is the state of maximum entropy: uniform temperature
          and maximum disorder.  Entropy is a measure of disorder and it
          increases with time, according to the second law of thermodynam-
          ics.

          Steven Frautschi points out that there is a striking  parallelism
          between the evolution of the expanding universe and the evolution
          of life on earth: because life on earth has a steady free  energy
          source (the sun), it does not need to come to equilibrium and may
          even evolve away from it (as it did when it created more and more
          complex  beings,  such  as ourselves); because the universe has a
          steady free energy source (the uniform expansion itself), it does
          not  need to come to equilibrium and may even evolve away from it
          (as it did when it  created  more  and  more  complex  clumps  of
          matter,  such  as  galaxies).   Biological evolution and universe
          evolution are consequences of nonequilibrium processes.

          Dilip Kondepudi analyzes Louis Pasteur's  discovery  that  living
          systems  prefer molecules with a certain handedness (all proteins
          are made of L-amino acids and genetic material is made of D-
          sugars); indeed, this molecular asymmetry is the only difference
          between the chemistry of living and of dead matter.  By looking
          for the origins of biomolecular chirality
          (i.e., of chiral-symmetry breaking in chemical systems), he finds
          similarities  with  parity  violation  in  weak  interactions and
          posits a fundamental asymmetry of the universe.

          Lionel Johnson thinks that emergent properties of biological sys-
          tems reflect a response both to the physical environment in which
          the systems are currently existing and to the  changing  environ-
          ments  in which they have existed over the course of evolutionary
          time. Emergent properties include: diversity increases over
          time  (i.e.,  the  number of species existing in the world during
          any one time period has increased over evolutionary time), diver-
          sity  increases from the poles to the equator, complexity of evo-
          lutionary lines increases over time, the production/biomass ratio
          (a  measure of the rate of energy flow through an ecosystem rela-
          tive to the energy accumulated in the biomass, i.e. a measure  of
          the  rate  at which new material must be produced to replace that
          lost through natural death, i.e. a measure of the rate of  energy
          dissipation,  i.e.  a  measure of the rate of entropy production)
          declines over time.  Johnson defines diversity in a fashion simi-
          lar to Shannon-Weaver's definition of information, which is simi-
          lar to Boltzmann's definition of entropy.
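
That definition can be written down directly: Shannon-Weaver diversity is
H = -sum(p_i * log2 p_i) over the proportions p_i of each species. A small
sketch with hypothetical abundances of my own:

    import math

    # Shannon-Weaver diversity: formally the same expression as
    # Shannon's information and (up to constants) Boltzmann's entropy.
    def diversity(abundances):
        total = sum(abundances)
        return -sum((n/total) * math.log2(n/total)
                    for n in abundances if n > 0)

    print(round(diversity([25, 25, 25, 25]), 3))  # 2.0: even community
    print(round(diversity([97, 1, 1, 1]), 3))     # 0.242: one dominates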

          Eric Schneider shows that the initial stages of  ecological  suc-
          cession  are  involved  in growth and maximization of free energy
          and structure (Lotka's power law) while later stages involve  the
          development  of  complexity and efficiency, which in turn require
          minimization of entropy production.

          Lionel Harrison suggests that increases of biological  order  can
          be  explained  in terms of kinetic theory as the result of diffu-
          sion and self-catalysis.

          Depew and Weber survey the problems encountered by neo-darwinism:
          the  relation  with  theories  of the origin of life, the complex
          structure of the genome, the punctuated pattern of the fossil
          record, etc.

Weinberg Steven: DREAMS OF A FINAL THEORY (Pantheon, 1993)

          Weinberg, a theoretical physicist who was awarded the Nobel prize
          for  the  unification  of  the  electromagnetic  and weak forces,
          believes that a unified theory of all theories exists that  would
          explain  the behavior of all animate and inanimate systems in the
          universe. Such a "grand grand" unification  theory  should  arise
          from  today's  theories of elementary particles, and from quantum
          theory in particular. Weinberg discusses  at  length  the  super-
          string theories as the first step towards such a unification pro-
          cess.  Weinberg does not seem to consider the mind a system worthy
          of study, and therefore he never mentions the discrepancies
          between today's Physics and the disciplines that study the  mind.
          The  reader  is left with the feeling that, if such a grand-grand
          unification theory is possible, it is highly unlikely that a phy-
          sicist will ever discover it, even by mistake.  In his previous
          book, "The First Three Minutes", Weinberg stated: "The  more  the
          universe  seems  comprehensible,  the more it seems pointless". I
          would suggest that he replace the word "it" with the word "Phy-
          sics".

Weld Daniel & DeKleer Johan: QUALITATIVE REASONING ABOUT PHYSICAL SYSTEMS (Morgan Kaufmann, 1990)

          A collection of seminal papers on  qualitative  reasoning,  which
          follows  Daniel Bobrow's book with the same title: a general sur-
          vey of the state of the art by Ken Forbus, Pat Hayes' new updated
          "naive physics manifesto", Johan DeKleer's "A qualitative physics
          based on confluences", Ken Forbus' "Qualitative  process  theory"
          and  Benjamin  Kuipers'  "Qualitative  simulation".   Each of the
          classical papers is revised and followed by an update  that  pro-
          vides more details.

          Also includes Brian Williams' "Temporal qualitative analysis" and
          James  Allen's  "Maintaining knowledge about temporal intervals",
          which provide techniques for reasoning about events taking  place
          over time.
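
Allen's calculus classifies every pair of intervals into exactly one of
thirteen qualitative relations (before, meets, overlaps, starts, during,
finishes, equals, plus the six inverses). A small sketch (the naming is mine)
that classifies two intervals given as (start, end) pairs:

    # Classify a pair of intervals into one of Allen's 13 relations.
    # Intervals are (start, end) pairs with start < end.
    def allen_relation(a, b):
        (s1, e1), (s2, e2) = a, b
        if e1 < s2:  return "before"
        if e2 < s1:  return "after"
        if e1 == s2: return "meets"
        if e2 == s1: return "met-by"
        if (s1, e1) == (s2, e2): return "equals"
        if s1 == s2: return "starts" if e1 < e2 else "started-by"
        if e1 == e2: return "finishes" if s1 > s2 else "finished-by"
        if s2 < s1 and e1 < e2: return "during"
        if s1 < s2 and e2 < e1: return "contains"
        return "overlaps" if s1 < s2 else "overlapped-by"

    print(allen_relation((1, 3), (3, 5)))   # meets
    print(allen_relation((1, 4), (2, 6)))   # overlaps
    print(allen_relation((2, 3), (1, 5)))   # during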

          Boi Faltings introduces a graph of places  that  share  important
          features.  For example, places where parts touch each other are
          more relevant to the development of the world. Common sense  per-
          ceives the world as connections between its parts.

Wellman Henry: THE CHILD'S THEORY OF MIND (MIT Press, 1990)

          Human knowledge is organized around naive theories that encompass
          specific  domains.  Such  theories  provide constraints for daily
          actions. One such theory is the theory of the mind (of the mental
          world  of thoughts, beliefs, fantasies, reasoning, etc). The book
          analyzes how children develop a commonsense understanding of  the
          mind.
           Wexler Ken & Culicover  Peter:  FORMAL  PRINCIPLES  OF  LANGUAGE
          ACQUISITION (MIT Press, 1980)

          A study of "learnability" (the process by which a child learns  a
          natural  language  when placed in the appropriate environment) in
          the context of  Chomsky's  theory  (that  the  child  has  innate
          universal principles, or a "universal  grammar",  with  open
          "parameters" that are set by experience).
           Whorf Benjamin Lee: LANGUAGE, THOUGHT AND  REALITY  (MIT  Press,
          1956)

          A collection of essays by Whorf.

          All higher thinking is dependent upon language.  Language  influ-
          ences  thought  because  it  contains  a hidden metaphysics.  The
          structure of the language influences the way its speakers  under-
          stand the environment.

          Whorf formulated the principle of linguistic  determinism:  gram-
          matical  and  categorial  patterns  of  language  embody cultural
          models. Language contains an implicit classification  of  experi-
          ence,  and  the  language  system as a whole contains an implicit
          world view. Every language is a culturally determined  system  of
          patterns  that  creates  the  categories by which individuals not
          only communicate but also think.  Language  therefore  influences
          thinking.
           Wicken  Jeffrey:  EVOLUTION,  INFORMATION   AND   THERMODYNAMICS
          (Oxford Univ Press, 1987)

          Wicken thinks that the most general entities subject  to  natural
          selection  are neither genes nor populations but information pat-
          terns  of   thermodynamic   flows,   such   as   ecosystems   and
          socioeconomic  systems.   Natural  selection  is  not an external
          force, but an  internal  process  such  that  macromolecules  are
          accrued  in  proportion to their usefulness for the efficiency of
          the global system.

          Wicken distinguishes between order (a statistical concept  refer-
          ring  to  the  regularity  in a sequence) and organization (which
          involves  spatio-temporal  and  functional  relationships   among
          parts). Thermodynamics can only account for the generation of
          structural complexity, but not for functional organization.

          Wicken proposes a generalized Lotka law: for any evolving
          system, strategies that channel resources into the system
          while stabilizing its energetic interconnections will be
          preferred. Such a process increases biomass/throughput ratios
          and decreases specific entropy production.

          Wicken aims at bridging Darwin and Boltzmann by showing that  the
          thermodynamic  forces  underlying the principles of variation and
          selection begin their operation in prebiotic evolution  and  lead
          to  the  emergence  and development of individual, ecological and
          socioeconomic life. The prebiosphere is treated as a  nonisolated
          closed system in which energy sources create steady thermodynamic
          cycles. Some of this energy is captured  and  dissipated  through
          the  formation  of  ever  more  complex chemical structures. Soon
          autocatalytic systems capable of reproduction appear. Living
          systems are but "informed autocatalytic systems".

Wiener Norbert: CYBERNETICS (John Wiley, 1948)

          This is the book that launched a formal  study  of  "intelligent"
          machines.   Wiener  recognized the importance of feedback for any
          meaningful behavior in the environment: a system that has to  act
          on the environment must be able to continuously compare its per-
          formed action with the intended action and then  infer  the  next
          action   from  their  difference. Feedback is crucial for homeos-
          tasis, which is crucial for survival.
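
          A minimal sketch of the feedback loop Wiener describes (the
          gain and step count are illustrative choices): the next
          action is inferred from the difference between the intended
          and the measured state.

          def feedback_control(target, state, gain=0.4, steps=20):
              # thermostat-like negative feedback
              for _ in range(steps):
                  error = target - state   # intended vs. performed
                  action = gain * error    # next action from the difference
                  state += action          # act on the environment
              return state

          print(feedback_control(target=20.0, state=5.0))  # converges near 20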

          Wiener emphasized that communication in nature is never  perfect:
          every  message  carries  some involuntary "noise" and in order to
          understand  the  communication  the  original  message  must   be
          restored.  This leads to a statistical theory of amount of infor-
          mation. A theory of information turns out to be  the  dual  of  a
          theory of entropy, another statistical concept: if information is
          a measure of order, entropy is a measure of disorder.

          Wiener understood the essential unity of  communication,  control
          and  statistical  mechanics, which is the same whether the system
          is an artificial system or a biological system. This unified
          field became "cybernetics".

          The second edition, in 1961, added a chapter on  self-reproducing
          machines and one on self-organizing systems.
           Wierzbicka  Anna:  SEMANTICS,  CULTURE,  AND  COGNITION  (Oxford
          University Press, 1992)

          Language is not just a tool for communication, but a tool to
          express meaning. To what extent meaning is
          language-independent depends on to what extent it is innate
          and to what extent it is shaped by culture. Meaning can be
          transferred from one language to another to some degree, but
          not fully. There exists a broad variety of semantic
          differences among languages (even emotions seem to be
          cultural artefacts), but a few semantic primitives have been
          proposed. Such universal semantic primitives make up a
          semantic metalanguage that could be used to explicate all
          other concepts in all languages.

          Wierzbicka therefore explores the languages of the world for  the
          building blocks of emotions, moral concepts, names, etc.

Wierzbicka Anna: THE SEMANTICS OF GRAMMAR (Benjamins, 1988)

          Language is a tool to communicate meaning, semantics is the
          study of meaning encoded in language, and syntax is a piece
          of semantics. Corresponding to the three types of tools
          employed by language to convey meaning (words, grammatical
          constructions and illocutionary devices), linguistics can be
          divided into lexical semantics, grammatical semantics and
          illocutionary semantics. The division into syntax, semantics
          and pragmatics makes no sense because every element and
          aspect of language carries meaning. Meaning is an
          individual's interpretation of the world. It is subjective
          and depends on the social and cultural context. Therefore,
          semantics encompasses lexicon, grammar and illocutionary
          structure.

          Grammatical semantics is divided into semantics of syntax and
          semantics of morphology. A metalanguage is defined to express the
          meaning of an expression.

          Wierzbicka also proves that constructions peculiar to a  language
          embody  a  view  of  the  world  specific  to the culture of that
          language. Therefore, she argues for an "ethno-syntax".
           Wilensky Robert: PLANNING  AND  UNDERSTANDING  (Addison  Wesley,
          1983)

          A pragmatic essay  on  planning  techniques  applied  to  natural
          language understanding.
           Wilks Yorick: THEORETICAL ISSUES  IN  NATURAL  LANGUAGE  (Lawrence
          Erlbaum, 1989)

          A collection of articles on techniques for natural language  pro-
          cessing,  including  connectionist  models,  discourse theory and
          approaches to metaphor.
          Wilks discusses his "preference semantics", which espouses a
          constraint-based approach. Natural language understanding
          comes from the integration of language constraints (syntactic
          and semantic) with context constraints. One type of semantic
          constraint is "preferences". Similar to Schank's
          expectations, they restrict the selection of senses of
          lexical entities. In preference semantics each sense of a
          word is associated with a structured semantic formula. During
          parsing, formulas are bound together into templates and
          syntax plays a minor role. Semantic deviance treats a
          metaphor as a violation of restriction rules within a
          context: metaphors are intentionally deviant.
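
          A toy illustration of sense selection by preferences (the
          lexicon and the scoring are invented for the example, not
          Wilks's own notation): each sense of "grasp" states the
          semantic class it prefers for its object, and the sense whose
          preferences are best satisfied wins; a violated preference is
          tolerated rather than rejected, which is how metaphor slips
          through.

          SENSES = {
              "grasp/hold":       {"object": "physical"},
              "grasp/understand": {"object": "abstract"},
          }

          def pick_sense(senses, object_class):
              # count how many preferences each sense satisfies
              def score(prefs):
                  return sum(1 for slot, cls in prefs.items()
                             if cls == object_class)
              return max(senses, key=lambda s: score(senses[s]))

          print(pick_sense(SENSES, "physical"))  # grasp/hold ("grasp the rope")
          print(pick_sense(SENSES, "abstract"))  # grasp/understand ("grasp the idea")
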
           Williams George: ADAPTATION  AND  NATURAL  SELECTION  (Princeton
          University Press, 1966)

          Williams' principle of parsimony requires that biological adapta-
          tions  be explained at the lowest level possible. Therefore, Wil-
          liams treats the gene as the fundamental unit of replication. The
          most  fundamental consequence of selection is differential repli-
          cation of genes.

          The gene is selected through an interaction with the
          environment at different environmental levels. At the genetic
          level the environment is the population gene pool, i.e. the
          other genes. The somatic level is an intermediate level that
          has to do with the succession of somatic stages in which a
          gene expresses itself: its selection depends on its mean
          success at the different stages. The ecological level is the
          environment proper, which can be viewed as the strategy
          employed by nature against the organism. The concept of
          fitness is appropriate at all epigenetic levels.

          By analyzing a number of cases of supposed group selection,
          Williams proves that group selection is not a significant
          factor. Natural selection originates from reproductive
          competition among individuals, and ultimately genes. A gene
          is selected on the basis of its ability to produce
          individuals that can maximize the gene's representation in
          future generations.

          Organisms are built according to a design carried out  by  genes,
          which are potentially immortal.

Wilson Edward Osborne: SOCIOBIOLOGY (Belknap, 1975)

          A general and monumental introduction to sociobiology, the
          discipline that studies the biological basis of social
          behavior.

          Wilson thinks that evolutionary theory can illuminate the
          social behavior of animals and humans. Altruism is apparently
          detrimental to personal fitness, but it evolved by natural
          selection for a utilitarian reason: altruism helps genes as a
          global pool, even if at the expense of the survival of a
          specific individual. Altruism is just another step, beyond
          personal survival and reproduction, in the program to
          proliferate maximally the genes of an organism.

          An organism is a mere gene-transporting device: its primary func-
          tion  is  not  even  to reproduce itself, but to reproduce genes.
          The mind itself is engineered to perpetuate DNA.  The brain is  a
          machine whose goal is to maximize fitness in its environment.

          All aspects of social behavior are defined formally. For example,
          Wilson interprets communication as the process that makes it pos-
          sible for the behavior of an animal to influence the behavior  of
          another  animal.   The  biological  functions and the origins (in
          ritualization) of communication are discussed at length.   Causes
          and  effects  of  changes in social behavior are analyzed drawing
          from a multitude of examples.

          The ultimate determinants of social organization are phylogenetic
          inertia  (the  set  of properties shared by a population that fix
          the extent to which its evolution can  be  deflected  in  another
          direction  and  the  amount  by  which  its evolution rate can be
          altered) and ecological pressure (the set  of  all  environmental
          factors that operate on the population).

          A central tenet of sociobiology is that all aspects of human cul-
          ture  and behavior are coded in the genes and have been molded by
          natural selection. Wilson is after a biological explanation
          for everything: religion, ethics, and ultimately the history
          of mankind. His program is to identify universals in human
          societies, i.e. to define human nature; then to assume that
          these universals are coded in the human genotype; and that
          they have been selected by evolution.
           Wilson Edward Osborne: THE DIVERSITY OF LIFE (Harvard University
          Press, 1992)

          Diversity is crucial to the existence of life as it  is.  At  the
          same  time it is fascinating that so much diversity is created in
          the biosphere. Diversity is the key to  survival  of  the  larger
          organisms,  the  ones at the top of the energy and biomass pyram-
          ids. The origin of biological diversity is a side product of evo-
          lution,  which is made of two parallel processes, one of vertical
          change in the original  population  and  one,  dependent  on  the
          former, of splitting of the original population (speciation). The
          origin of species (which Darwin did not explain) is due to
          the evolution of structural differences (hereditary isolating
          mechanisms) that prevent the production of fertile hybrids
          between populations. Such differences arise as traits that
          adapt species to the environment. Two basic levels in
          biological diversity can be identified: genetic variation
          within species and differences among species.

          By surveying adaptive radiation (the spread of species of  common
          ancestry into different niches) and evolutionary convergence (the
          occupation of the same niche by outcomes  of  different  adaptive
          radiations), Wilson proves that opportunity causes an explosion of
          species formation.

          The book, written in colloquial English, is an excellent
          introduction to modern themes of evolutionary biology. Rather
          than offering a textbook view of firm theories, it
          continuously shows the limits of our current knowledge. The
          last part deals with ethical issues.
           Wilson Edward & Lumsden Charles: GENES, MIND AND  CULTURE  (Har-
          vard Univ Press, 1981)

          Wilson softens his original stance on the mind as a mere
          gene-transporting machine and attempts a unified theory of
          biology and the social sciences, from genes to mind to
          culture, positing a strong coupling between genetic and
          cultural evolution.

          Culture is the product of the interaction of all the mental
          and physical artifacts of a population. Human culture is a
          form of "euculture", which involves reification (the
          production of concepts and the continuous reclassification
          of the world, including the ability to symbolize) besides
          teaching, imitation and learning (which are present in many
          other animals). A culture is perceived through its
          "culturgens" (behaviors and artifacts), the basic units of
          inheritance in cultural evolution. Each individual is
          genetically endowed with epigenetic rules to process
          culturgens. These rules assemble the mind of the individual.
          They include genetically determined sensory filters and
          cognitive faculties, and affect the probability of
          transmitting one culturgen as opposed to another. Epigenesis
          is defined as the process of interaction between genes and
          the environment during development. Epigenetic rules affect
          both primary functions such as hearing and secondary
          functions such as mother-infant bonding and incest
          avoidance.

          One or more culturgens are favored by the epigenetic rules.
          Eucultural species evolve towards a type of cultural
          transmission in which a dual shift occurs in time
          ("gene-culture coevolution"): change in the epigenetic rules
          due to shifts in gene frequencies, and change in culturgen
          frequencies due to the epigenetic rules. The two shifts exert
          a mutual influence.
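
          A toy coupled recursion (emphatically not Lumsden and
          Wilson's actual equations, just an illustration of the dual
          shift): g is the frequency of a gene biasing an epigenetic
          rule, c the frequency of the culturgen that the rule favors;
          each change feeds the other.

          def coevolve(g=0.1, c=0.1, s=0.05, bias=0.3, generations=200):
              for _ in range(generations):
                  # culturgen adoption is tilted by the epigenetic rule
                  c = c + bias * g * c * (1 - c)
                  # the gene gains fitness where its culturgen is common
                  g = g + s * c * g * (1 - g)
              return g, c

          print(coevolve())   # both frequencies rise together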

          The epigenetic rules exhibit genetic variation, thereby
          contributing to the variance of cognitive traits within a
          population. The fitness of individuals differs depending on
          their minds' behaviors. Therefore the population as a whole
          tends to shift towards the most efficient epigenetic rules.

          The general model is one in which the offspring learn to "social-
          ize"  from  their  age peers and parent generation. They evaluate
          the culturgens and assimilate them depending on their  epigenetic
          rules; and then use the outcome to exploit the environment.

          The authors review evidence from daily habits that suggests
          a relation between genes and culture.
           Winograd Terry: LANGUAGE AS A COGNITIVE PROCESS (Addison Wesley,
          1983)

          A textbook for natural language  processing:  grammars,  parsing,
          transformations,  ATNs, case grammar, lexical-functional grammar,
          generalized phrase-structure grammar. Techniques are detailed for
          computer implementation.
           Winograd Terry: UNDERSTANDING NATURAL LANGUAGE (Academic  Press,
          1972)

          A description and discussion of a natural language
          understanding program (SHRDLU) based on an integrated model
          of syntax, semantics and inference and applied to the blocks
          world.
           Winograd Terry & Flores Fernando:  UNDERSTANDING  COMPUTERS  AND
          COGNITION (Ablex, 1986)

          Drawing from Heidegger's phenomenology and Maturana's
          cognitive biology, Winograd denies that intelligence can be
          due to processes of the type of production systems, i.e. to
          the systematic manipulation of representations. Intelligent
          systems act rather than think. They think only when action
          does not yield the desired result: only then do they
          decompose the situation and try to infer action from
          knowledge.

          In language the role of the listener is emphasized for the active
          generation  of  meaning.  Language  is ultimately based on social
          interactions, as proved by the speech act theory  of  Austin  and
          Searle.

          The book concludes that the program  of  Artificial  Intelligence
          must  be changed to view the computer merely as a tool to improve
          the life of humans.

Winson Jonathan: BRAIN AND PSYCHE (Anchor Press, 1985)

          Winson believes in a connection  between  the  neurophysiological
          processes of the brain (specifically, of the hippocampus) and the
          unconscious, which lends Freud's psychoanalytical  theories  bio-
          logical plausibility.

          Dreams are the bridge between the conscious and the
          unconscious. There is a biologically relevant reason to
          dream: a dream is an ordered processing of memory which
          interprets experience that is precious for survival. Dreaming
          is a sort of off-line processing essential to learning. The
          Freudian subconscious corresponds to phylogenetically ancient
          mechanisms involving REM sleep, in which memories and
          strategies are formed in the prefrontal cortex.

Winston Patrick: ARTIFICIAL INTELLIGENCE (Addison Wesley, 1993)

          Third edition of one of  the  earliest  textbooks  on  artificial
          intelligence.
           Wittgenstein Ludwig:  PHILOSOPHICAL  INVESTIGATIONS  (Macmillan,
          1953)

          One of the milestone books of modern philosophy,  it  contains  a
          wealth of ideas.

          Foremost is the theory of family resemblance.   A  category  like
          "game" does not fit the classical idea of categories being closed
          by clear boundaries and defined by  common  properties  of  their
          members.   What  unites  the category is family resemblance, plus
          sets of positive and negative examples;  and  boundaries  may  be
          extended at any time.
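
          One computational reading of the idea (an illustration, not
          Wittgenstein's text): membership in "game" requires enough
          overlap with known positive examples, no single feature being
          necessary.

          GAMES = [
              {"rules", "competition", "players", "fun"},  # chess
              {"rules", "fun", "players"},                 # ring-a-ring-o'-roses
              {"rules", "competition", "luck"},            # dice
          ]

          def resembles_a_game(features, threshold=2):
              # member if it overlaps enough with at least one exemplar
              return any(len(features & g) >= threshold for g in GAMES)

          print(resembles_a_game({"competition", "luck", "money"}))  # True
          print(resembles_a_game({"wheels", "engine"}))              # False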

          About language in general, Wittgenstein argues that to
          understand a word is to understand a language, and to
          understand a language is to master the relevant linguistic
          skills.

          Wittgenstein systematically demolishes all pre-existing  theories
          of  meaning.   In particular, he abandons Frege's notion of sense
          (and any intensionalist notion of sense).
           Wolfram Stephen:  CELLULAR  AUTOMATA  AND  COMPLEXITY  (Addison-
          Wesley, 1994)

          A collection of papers by Wolfram from 1982 to 1986. A
          number of studies present a general mathematical model for
          cellular automata viewed as discrete self-organizing
          dynamical systems. They can be organized into four classes,
          which behave respectively like limit points, limit cycles,
          chaotic attractors and universal computing machines. Their
          evolution is almost always irreversible. Entropies and
          Lyapunov exponents measure the information content and rate
          of information transmission in cellular automata.
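
          A minimal sketch of the kind of system Wolfram classifies: a
          one-dimensional binary cellular automaton on a ring. Rule
          110, used here, belongs to class 4 (complex behavior, later
          proved capable of universal computation).

          RULE = 110

          def step(cells):
              # each cell's next state is the RULE bit indexed by its
              # (left, center, right) neighborhood, read as a 3-bit number
              n = len(cells)
              return [(RULE >> (cells[(i - 1) % n] * 4
                                + cells[i] * 2
                                + cells[(i + 1) % n])) & 1
                      for i in range(n)]

          cells = [0] * 31 + [1] + [0] * 31   # a single live cell
          for _ in range(16):
              print("".join(".#"[c] for c in cells))
              cells = step(cells)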

Wood Mary McGee: CATEGORIAL GRAMMARS (Routledge, 1993)

          A short, compact but very technical manual  that  summarizes  the
          state of the art in categorial grammars.

          Categorial grammars, which originated from the logic of
          Ajdukiewicz (1935) and the algebraic calculus of Joachim
          Lambek (1958), represent semantics directly in syntax.
          Categorial grammars represent a refinement of
          phrase-structure grammars as they assign an internal
          structure to category symbols. The set of categories is
          defined recursively: if X and Y are categories, then any
          function from X into Y is also a category.
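
          A minimal sketch of the two application rules of a basic
          categorial grammar (the lexicon is invented for the example;
          Lambek notation, where X/Y looks right for a Y and yields X,
          and X\Y looks left for an X and yields Y):

          LEXICON = {
              "John":   "NP",
              "sleeps": "NP\\S",    # looks left for an NP, yields S
              "the":    "NP/N",     # looks right for an N, yields NP
              "cat":    "N",
          }

          def combine(left, right):
              if "/" in left and left.split("/", 1)[1] == right:
                  return left.split("/", 1)[0]       # forward application
              if "\\" in right and right.split("\\", 1)[0] == left:
                  return right.split("\\", 1)[1]     # backward application
              return None

          print(combine("NP/N", "N"))      # NP  ("the cat")
          print(combine("NP", "NP\\S"))    # S   ("John sleeps")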

          The book sketches the history of the field,  from  Bar-Hillel  to
          Montague.   The various types of categorial grammars, from Lambek
          calculus to more complex variants, are introduced.
           Woods William: SEMANTICS FOR A QUESTION-ANSWERING  SYSTEM  (Gar-
          land, 1967)

          This question-answering system employed the  first  computational
          model  for natural-language semantic interpretation. It defined a
          procedural semantics and introduced the ATN grammar.
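
          A minimal sketch in the spirit of a transition-network
          grammar (an illustration only; Woods's full ATN adds
          registers and arbitrary tests on the arcs): each network is a
          set of states whose arcs either consume a word of a lexical
          category or recursively traverse a subnetwork.

          LEX = {"the": "DET", "dog": "N", "barks": "V"}
          NETS = {
              "S":  {0: [("PUSH", "NP", 1)], 1: [("CAT", "V", 2)], 2: []},
              "NP": {0: [("CAT", "DET", 1)], 1: [("CAT", "N", 2)], 2: []},
          }

          def traverse(net, words, state=0):
              # yield the remaining words for every way `net` can be
              # traversed from `state`
              if not NETS[net][state]:               # final state
                  yield words
              for kind, label, nxt in NETS[net][state]:
                  if kind == "CAT" and words and LEX.get(words[0]) == label:
                      yield from traverse(net, words[1:], nxt)
                  elif kind == "PUSH":               # recurse into subnet
                      for rest in traverse(label, words):
                          yield from traverse(net, rest, nxt)

          sentence = ["the", "dog", "barks"]
          print(any(rest == [] for rest in traverse("S", sentence)))  # True
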
           Wright Larry:  TELEOLOGICAL  EXPLANATIONS  (Univ  of  California
          Press, 1976)

          This "etiological analysis of  goals  and  functions"  employs  a
          slight  variation  of  Charles Taylor's definition of behavior (a
          goal-directed function  of  the  state  of  the  system  and  the
          environment). Wright thinks that any feature of a species exists
          because it conferred an advantage under natural selection. Evolution is
          the  fundamental  criterion  to  determine the function of a pro-
          perty.
           Yager Ronald & Zadeh  Lotfi:  AN  INTRODUCTION  TO  FUZZY  LOGIC
          APPLICATIONS (Kluwer Academic, 1992)

          A collection of articles that mainly deals  with  knowledge-based
          applications of fuzzy logic.

Zadeh Lotfi: FUZZY LOGIC APPLICATIONS (Kluwer Academic, 1992)

          Zadeh's 1965 article "Fuzzy Sets" applied Lukasiewicz's
          multivalued logic to sets, so that elements belong to a set
          to different degrees.  In classical logic inference is
          performed symbolically, regardless of the meaning of the
          formulae. In fuzzy logic statements are translated into
          elastic constraints and the meaning is computed directly via
          nonlinear techniques.

          The degree of truth is a measure of the coherence between a  pro-
          position  about the world and the state of the world. The meaning
          of a proposition is the constraint  that  limits  (explicitly  or
          implicitly) the values of the variables in that proposition.

          Zadeh defines a procedure to compute the meaning, i.e. that  con-
          straint, through non-linear programming techniques. A proposition
          can be true, false, partially known or vague  with  a  degree  of
          vagueness.

          Zadeh's theory of fuzzy quantities assumes that things are
          not necessarily true or false, but have degrees of truth.
          Fuzzy logic is a multi-valued logic that extends classical
          logic.

          Fuzzy logic can explain paradoxes such as the one about  removing
          a  grain  of sand from a pile of sand (when does the pile of sand
          stop being a pile of sand?). In fuzzy logic each  application  of
          the inference rule erodes the truth of the resulting proposition.
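
          A sketch of that erosion at work on the sand-pile paradox;
          the linear membership function and the Lukasiewicz operators
          are illustrative choices, not prescribed by the book.

          def is_heap(n, full=10_000):
              return min(1.0, n / full)      # degree of "heap-ness"

          truth = 1.0                        # "10000 grains form a heap"
          for n in range(10_000, 0, -1):
              # Lukasiewicz implication: a => b is min(1, 1 - a + b)
              step = min(1.0, 1.0 - is_heap(n) + is_heap(n - 1))
              # Lukasiewicz conjunction: max(0, a + b - 1)
              truth = max(0.0, truth + step - 1.0)
          print(truth)   # ~0: "0 grains form a heap" has lost all truth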

          As for Duhem's principle of incompatibility, the certainty that a
          proposition is true decreases with any increase of its precision.

          A fuzzy set is a set whose elements belong to it only to
          some extent: each element is characterized by a degree of
          membership. An object can belong (partially) to more than one
          set, even if the sets are mutually exclusive. Each set can be
          a subset of another set with a degree of membership. A set
          can even belong (partially) to one of its own parts.

          A distribution of possibilities (relative to a variable)
          projects the universe of discourse (relative to that
          variable) into the continuous unit interval. The distribution
          specifies what is epistemically possible, i.e. the values
          admissible for that variable. The value of the distribution
          for a term T of the discourse expresses the degree of
          preference attributed to the expression "the value of the
          variable is T", i.e. the degree of possibility of T for that
          variable.
           Zeki Semir: A VISION OF THE BRAIN (Blackwell, 1993)
          An investigation of the visual cortex from a neurobiological
          viewpoint led Zeki to argue that perception and comprehension
          of the world occur simultaneously thanks to reentrant
          (reciprocal) connections between all the specialized areas of
          the visual cortex. Since the visual cortex constitutes a
          large part of the cerebral cortex, the same properties are
          likely to hold for the rest of the cortex. It appears then
          that the function of the sensory parts of the visual cortex
          is to categorize environmental stimuli.

          The brain copes with a continually changing environment by focus-
          ing  on  a  few  unchanging characteristics of objects out of the
          numberless ever-changing bits of  information  that  it  receives
          from  those  objects.  The  brain basically is programmed to make
          itself as independent as possible from world changes.  The  brain
          cannot  simply  absorb  information from the environment, it must
          process it to extract those constant features that represent  the
          physical essence of objects.

Zimmermann Hans: FUZZY SET THEORY (Kluwer Academic, 1985)

          A thorough introduction to the theory of fuzzy sets.  The  second
          part deals with applications in several fields.