CCAT relevant to learning? (Fritz Lehmann)
Date: Sun, 13 Nov 94 12:40:28 CST
From: (Fritz Lehmann)
Subject: CCAT relevant to learning?
Cc: anquetil@IRO.UMontreal.CA, billrich@VNET.IBM.COM, ...
Precedence: bulk

Dear Norman,

     You wrote:
-------begin quote-----
Date: Tue, 18 Oct 1994 14:08:22 +1000
From: (Norman Foo)
Subject:   Ontologies Group
My research group is pursuing work in ontologies from the machine learning
point of view.  Is it appropriate for us to join the CCAT group?  We do
not have an interest in detailed ontologies, but only in what kinds of
situations can lead to learnable ontologies.  This is being done in a
controlled way, by assuming bounded differences between current theories
and those that have to be learned.
--------end quote----------

     This is a quite interesting question.  As you can see from our
CCAT discussions, we're concerned with the _content_ and _structure_
of the ontologies.  However, some learning issues are intimately
involved.  For example, we recognize that few areas will have
genuine "iff" definitions; rather, they will have constraints.
I imagine an "upper bound" and a "lower bound" for every concept
definition (a poset interval), something like the approximations
in Rough Set theory.  The _actual_ definition is "in there", but
unattainable.  Learning may be the process of refining the interval.
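To make the idea concrete, here is a minimal sketch (all names hypothetical, not from any CCAT proposal): a concept is held as an interval [lower, upper] of instance sets, in the spirit of Rough Set lower/upper approximations.  Positive examples grow the lower bound, negative examples shrink the upper bound, and the "actual" definition lies somewhere in between; learning narrows the gap.

```python
class ConceptInterval:
    """A concept held as a poset interval [lower, upper] of instance sets."""

    def __init__(self, universe):
        self.lower = set()          # instances known to be in the concept
        self.upper = set(universe)  # instances not yet ruled out

    def observe(self, instance, is_member):
        # Each labeled example refines the interval from one side.
        if is_member:
            self.lower.add(instance)
        else:
            self.upper.discard(instance)

    def width(self):
        # Size of the remaining region of uncertainty.
        return len(self.upper - self.lower)


universe = {"sparrow", "penguin", "bat", "trout"}
flies = ConceptInterval(universe)
flies.observe("sparrow", True)
flies.observe("trout", False)
print(flies.lower)    # {'sparrow'}
print(flies.width())  # 2 -- penguin and bat still undecided
```

The unattainable "actual" definition corresponds to any set sandwiched between `lower` and `upper`; a width of zero would mean the interval has collapsed to a single definition.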

     Another crucially important issue is that of ontological
"dimensions"; these also occur in learning theories to justify
distance measures, and in case-based retrieval systems.  I hope
that CCAT will be able to generate a good theory (both as to
structure and as to content) on this.  The question is, how and why
are knowledge hierarchies with independent "factors" to be
distinguished from mere jumbles of logical predicates?
What features of reality generate what structural constraints
and/or factorizations of knowledge?  (A-priori synthetic stuff?)
I surmise that your "bounded difference" presupposes some
dimensions in which to measure distances; is that right?
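One hypothetical reading of "bounded difference" along explicit dimensions (the dimension names and the Hamming-style count are my illustration, not Norman's definition): concepts are feature assignments over named dimensions, and a revised theory's concept is admissible only if it differs from the current one in at most a bounded number of dimensions.

```python
def distance(a, b, dims):
    # Hamming-style distance: count the dimensions on which a and b differ.
    return sum(1 for d in dims if a.get(d) != b.get(d))


DIMS = ["habitat", "locomotion", "diet"]  # assumed ontological dimensions

current = {"habitat": "land", "locomotion": "walks", "diet": "plants"}
revised = {"habitat": "land", "locomotion": "flies", "diet": "plants"}

BOUND = 1  # the learner may move at most one dimension per revision
print(distance(current, revised, DIMS) <= BOUND)  # True: within bound
```

The point of the sketch is only that a distance of this kind is undefined until the dimensions are fixed, which is why the bounded-difference assumption seems to presuppose them.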

     If a learning theory dynamically alters and adds to the
poset of categories (concepts and relations ordered by generality)
or the structure of the formal definitions, then I presume it has
to respect the global constraints on the large-scale structure
(dimensions, partition-constraints, arity-constraints, etc.).

     Another area involving learning is the integration of
differing ontologies.  Where there are no exact equivalences
between definitions of corresponding concepts, how do we
discover that they are in fact "corresponding"?  It may take
some kind of learning to relate them, or to create a link of
interpolated concepts.  The learning could depend on instances
of a concept, or on analogous linkages within the hierarchies
of the two ontologies to be linked.  (This may be an alternative
to my EGG/YOLK theory.)
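The instance-based route can be sketched briefly (the ontologies, concept names, and Jaccard measure here are all my hypothetical example, not the EGG/YOLK theory itself): two ontologies' concepts are matched by the overlap of their known instances, one way "corresponding" concepts might be found without exact definitional equivalence.

```python
def jaccard(a, b):
    # Overlap of two instance sets: |intersection| / |union|.
    return len(a & b) / len(a | b) if a | b else 0.0


# Two toy ontologies with partially overlapping extensions.
ont_a = {"Auto": {"civic", "corolla", "f150"}}
ont_b = {"Car": {"civic", "corolla"}, "Boat": {"canoe"}}


def best_match(instances, other):
    # Pick the concept in the other ontology with maximal instance overlap.
    return max(other, key=lambda c: jaccard(instances, other[c]))


print(best_match(ont_a["Auto"], ont_b))  # 'Car'
```

A correspondence found this way is of course only as good as the shared instance data; the analogous-linkage route would instead compare the positions of the concepts within their respective hierarchies.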

     If your group addresses these (or anything relating to
actual content, based on the real world) then I think CCAT
would be relevant.  I suspect this would only involve the
third tier of your three-tier approach (i.e. 1. AGM-revise, 2.
Abduct, 3. Re-ontologize).  I think levels 1 and 2 would not
be relevant to CCAT, but maybe I'm mistaken.  You'll have to
decide whether level 3 is relevant.  (Generally speaking, CCAT
needs all the help it can get!)

                          Yours truly,   Fritz Lehmann
GRANDAI Software, 4282 Sandburg Way, Irvine, CA 92715, U.S.A.
Tel:(714)-733-0566  Fax:(714)-733-0506