what is ontology

B. Chandrasekaran and John R. Josephson, Ohio State University; V. Richard Benjamins, University of Amsterdam

Theories in AI fall into two broad categories: mechanism theories and content theories. Ontologies are content theories about the sorts of objects, properties of objects, and relations between objects that are possible in a specified domain of knowledge. They provide potential terms for describing our knowledge about the domain.

In this article, we survey the recent development of the field of ontologies in AI. We point to the somewhat different roles ontologies play in information systems, natural-language understanding, and knowledge-based systems. Most research on ontologies focuses on what one might characterize as domain factual knowledge, because knowledge of that type is particularly useful in natural-language understanding. There is another class of ontologies that are important in knowledge-based systems (KBS): ontologies that help in sharing knowledge about reasoning strategies or problem-solving methods. In a follow-up article, we will focus on method ontologies.

THIS SURVEY PROVIDES A CONCEPTUAL INTRODUCTION TO ONTOLOGIES AND THEIR ROLE IN INFORMATION SYSTEMS AND AI. THE AUTHORS ALSO DISCUSS HOW ONTOLOGIES CLARIFY THE DOMAIN'S STRUCTURE OF KNOWLEDGE AND ENABLE KNOWLEDGE SHARING.

Ontology as vocabulary

In philosophy, ontology is the study of the kinds of things that exist. It is often said that ontologies "carve the world at its joints." In AI, the term ontology has largely come to mean one of two related things. First of all, ontology is a representation vocabulary, often specialized to some domain or subject matter. More precisely, it is not the vocabulary as such that qualifies as an ontology, but the conceptualizations that the terms in the vocabulary are intended to capture. Thus, translating the terms in an ontology from one language to another (for example, from English to French) does not change the ontology conceptually. In engineering design, you might discuss the ontology of an electronic-devices domain, which might include vocabulary that describes conceptual elements (transistors, operational amplifiers, and voltages) and the relations between these elements: operational amplifiers are a type-of electronic device, and transistors are a component-of operational amplifiers. Identifying such vocabulary, and the underlying conceptualizations, generally

1094-7167/99/$10.00 © 1999 IEEE

requires careful analysis of the kinds of objects and relations that can exist in the domain.

In its second sense, the term ontology is sometimes used to refer to a body of knowledge describing some domain, typically a commonsense knowledge domain, using a representation vocabulary. For example, CYC often refers to its knowledge representation of some area of knowledge as its ontology.

In other words, the representation vocabulary provides a set of terms with which to describe the facts in some domain, while the body of knowledge using that vocabulary is a collection of facts about the domain. However, this distinction is not as clear as it might first appear. In the electronic-device example, that a transistor is a component-of an operational amplifier, or that the latter is a type-of electronic device, is just as much a fact about

    IEEE INTELLIGENT SYSTEMS


its domain as a CYC fact about some aspect of space, time, or numbers. The distinction is that the former emphasizes the use of ontology as a set of terms for representing specific facts in an instance of the domain, while the latter emphasizes the view of ontology as a general set of facts to be shared.
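To make the vocabulary-versus-facts distinction concrete, here is a minimal sketch in Python (our own illustration, not the article's notation or any standard ontology language) of the electronic-devices vocabulary as explicit data, with a small helper that follows type-of links transitively:

```python
# Sketch of the electronic-devices vocabulary: concepts plus the two
# relations discussed in the text, type-of and component-of. All names
# are illustrative assumptions, not taken from any specific ontology.
ontology = {
    "concepts": {"electronic-device", "operational-amplifier",
                 "transistor", "voltage"},
    "type-of": {("operational-amplifier", "electronic-device"),
                ("transistor", "electronic-device")},
    "component-of": {("transistor", "operational-amplifier")},
}

def is_a(onto, child, parent):
    """True if child reaches parent by following type-of links."""
    if (child, parent) in onto["type-of"]:
        return True
    return any(is_a(onto, mid, parent)
               for (c, mid) in onto["type-of"] if c == child)

print(is_a(ontology, "operational-amplifier", "electronic-device"))  # True
```

A knowledge base in the second sense of ontology would then add instance-level facts stated in this same vocabulary.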

Inconsistencies persist in the usage of the term ontology. At times, theorists use the singular term to refer to a specific set of terms meant to describe the entity and relation types in some domain. Thus, we might speak of an ontology for liquids or for parts and wholes. Here, the singular term stands for the entire set of concepts and terms needed to speak about phenomena involving liquids and parts and wholes. When different theorists make different proposals for an ontology, or when we speak about ontology proposals for different domains of knowledge, we would then use the plural term ontologies to refer to them collectively. In the AI and information-systems literature, however, there seems to be inconsistency: sometimes we see references to the "ontology of a domain" and other times to the "ontologies of a domain," both referring to the set of conceptualizations for the domain. The former is more consistent with the original (and current) usage in philosophy.

Ontology as content theory

The current interest in ontologies is the latest version of AI's alternation of focus between content theories and mechanism theories. Sometimes, the AI community gets excited by some mechanism, such as rule systems, frame languages, neural nets, fuzzy logic, constraint propagation, or unification. The mechanisms are proposed as the secret of making intelligent machines. At other times, we realize that, however wonderful the mechanism, it cannot do much without a good content theory of the domain on which it is to work. Moreover, we often recognize that once a good content theory is available, many different mechanisms might be used equally well to implement effective systems, all using essentially the same content.2

AI researchers have made several attempts to characterize the essence of what it means to have a content theory. McCarthy and Hayes's theory (the epistemic-versus-heuristic distinction),3 Marr's three-level theory (information-processing strategy level, algorithms and data structures level, and physical mechanisms level),4 and Newell's theory (Knowledge Level versus Symbol Level)5 all grapple in their own ways with characterizing content. Ontologies are quintessentially content theories, because their main contribution is to identify specific classes of objects and relations that exist in some domain. Of course, content theories need a representation language. Thus far, predicate-calculus-like formalisms, augmented with type-of relations (that can be used to induce class hierarchies), have been most often used to describe the ontologies themselves.

Why are ontologies important?

First, ontological analysis clarifies the structure of knowledge. Given a domain, its ontology forms the heart of any system of knowledge representation for that domain. Without ontologies, or the conceptualizations that underlie knowledge, there cannot be a vocabulary for representing knowledge. Thus, the first step in devising an effective knowledge-representation system, and vocabulary, is to perform an effective ontological analysis of the field, or domain. Weak analyses lead to incoherent knowledge bases.

An example of why performing good analysis is necessary comes from the field of databases.6 Consider a domain having several classes of people (for example, students, professors, employees, females, and males). This study first examined the way such a database would commonly be organized: students, employees, professors, males, and females would be represented as types-of the class humans. However, one problem with this ontology is that students can also be employees at times, and can also stop being students. Further analysis showed that the terms student and employee do not describe categories of humans but are roles that humans can play, while terms such as female and male more appropriately represent subcategories of humans. Clarifying the terminology in this way lets the ontology support coherent and cohesive reasoning.

Second, ontologies enable knowledge sharing. Suppose we perform an analysis and arrive at a satisfactory set of conceptualizations, and their representative terms, for some area of knowledge (for example, the electronic-devices domain). The resulting ontology would likely include domain-specific

terms such as transistors and diodes; general terms such as functions, causal processes, and modes; and terms that describe behavior, such as voltage. The ontology captures the intrinsic conceptual structure of the domain. To build a knowledge-representation language based on the analysis, we need to associate terms with the concepts and relations in the ontology and devise a syntax for encoding knowledge in terms of those concepts and relations. We can share this knowledge-representation language with others who have similar needs for knowledge representation in that domain, thereby eliminating the need to replicate the knowledge-analysis process. Shared ontologies can thus form the basis for domain-specific knowledge-representation languages. In contrast to the previous generation of knowledge-representation languages (such as KL-One), these languages are content-rich; they have a large number of terms that embody a complex content theory of the domain.

Shared ontologies also let us build specific knowledge bases that describe specific situations. For example, different electronic-devices manufacturers can use a common vocabulary and syntax to build catalogs that describe their products. The manufacturers could then share the catalogs and use them in automated design systems. This kind of sharing vastly increases the potential for knowledge reuse.
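The role-versus-subcategory analysis from the database example above can be sketched as follows; the class and attribute names are our own, hypothetical choices:

```python
# Roles (student, employee) are acquired and dropped over time;
# subcategories (female, male) are fixed for the individual.
class Human:
    def __init__(self, name, sex):
        self.name = name
        self.sex = sex          # subcategory of human: does not change
        self.roles = set()      # roles the human currently plays

alice = Human("Alice", "female")
alice.roles.add("student")
alice.roles.add("employee")     # a student can also be an employee
alice.roles.discard("student")  # and can stop being a student
print(alice.roles)  # {'employee'}
```

Modeling student as a subclass instead would force the individual to change class upon graduating, which is exactly the incoherence the analysis avoids.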

Describing the world

We can use the terms provided by the domain ontology to assert specific propositions about a domain or a situation in a domain. For example, in the electronic-device domain, we can represent a fact about a specific circuit: circuit 35 has transistor 22 as a component, where circuit 35 is an instance of the concept circuit and transistor 22 is an instance of the concept transistor. Once we have the basis for representing propositions, we can also represent knowledge involving propositional attitudes (such as hypothesize, believe, expect, hope, desire, and fear). Propositional-attitude terms take propositions as arguments. Continuing with the electronic-device domain, we can assert, for example: the diagnostician hypothesizes or believes that part 2 is broken, or the designer expects or desires that the power plant has an output of 20 megawatts. Thus, an ontology can represent beliefs, goals,




"On the one hand there are entities, such as processes and events, which have temporal parts. ... On the other hand there are entities, such as material objects, which are always present in their entirety at any time at which they exist at all. The categorical distinction between entities which do, and entities which do not, have temporal parts is grounded in common sense. Yet various philosophers have been inclined to oppose it. Some ... have defended an ontology consisting exclusively of things with no temporal parts. Whiteheadians have favored ontologies including only temporally extended processes. Quine has endorsed a four-dimensional ontology in which the distinction between objects and processes vanishes and every entity comprises simply the content of some arbitrarily demarcated portion of space-time. One further option, embraced by philosophers such as David Lewis, accepts the opposition between objects and processes, while still finding a way to allow that all entities have both spatial and temporal parts."

hypotheses, and predictions about a domain, in addition to simple facts. The ontology also plays a role in describing such things as plans and activities, because these also require specification of world objects and relations. Propositional-attitude terms are also part of a larger ontology of the world, useful especially in describing the activities and properties of the special class of objects in the world called intensional entities: for example, agents such as humans who have mental states.
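One way such assertions might look, in a simple nested representation of our own devising, is to let propositional attitudes take whole propositions as arguments:

```python
# Hypothetical nested representation: an instance-level fact stated in
# the ontology's vocabulary, and a propositional attitude (believes)
# taking a whole proposition as its argument. All names are invented.
fact = ("component-of", "transistor-22", "circuit-35")
broken = ("broken", "part-2")                    # a proposition
attitude = ("believes", "diagnostician", broken)  # an attitude over it

def pretty(p):
    """Render a fact, property, or attitude as English."""
    if p[0] == "believes":
        return f"{p[1]} believes that {pretty(p[2])}"
    return f"{p[1]} is {p[0]}" if len(p) == 2 else f"{p[1]} {p[0]} {p[2]}"

print(pretty(attitude))  # diagnostician believes that part-2 is broken
```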

Constructing ontologies is an ongoing research enterprise. Ontologies range in abstraction, from very general terms that form the foundation for knowledge representation in all domains, to terms that are restricted to specific knowledge domains. For example, space, time, parts, and subparts are terms that apply to all domains; malfunction applies to engineering or biological domains; and hepatitis applies only to medicine. Even in cases where a task might seem quite domain-specific, knowledge representation might call for an ontology that describes knowledge at higher levels of generality. For example, solving problems in the domain of turbines might require knowledge expressed using domain-general terms such as flows and causality. Such general-level descriptive terms are called the upper ontology or top-level ontology. There are many open research issues about the correct ways to analyze knowledge at the upper level. To provide some idea of the issues involved, Figure 1 excerpts a quote from a recent call for papers.

Today, ontology has grown beyond philosophy and now has many connections to information technology. Thus, research on ontology in AI and information systems has had to produce pragmatically useful proposals for top-level ontology. The organization of a top-level ontology contains a number of problems, similar to the problems that surround ontology in philosophy.

Figure 1. Call for papers for a special issue on temporal parts for The Monist: An International Quarterly Journal of General Philosophical Inquiry. This quote suggests that ontology has always been an issue of deep concern in philosophy and that the issues continue to occupy contemporary philosophers.

For example, many ontologies have thing or entity as their root class. However, Figure 2 illustrates that thing and entity start to diverge at the next level. For example, CYC's thing has the subcategories individual object, intangible, and represented thing; the Generalized Upper Model's (GUM) um-thing has the subcategories configuration, element, and sequence; Wordnet's8 thing has the subcategories living thing and nonliving thing; and Sowa's root T has the subcategories concrete, process, object, and abstract. (Natalya Fridman Noy and Carol Hafner's article discusses these differences more fully.9) Some of these differences arise because not all of these ontologies are intended to be general-purpose tools, or even explicitly to be ontologies. Another reason for the differences is that, in principle, there are many different taxonomies.

Although differences exist within ontologies, general agreement exists between ontologies on many issues:

• There are objects in the world.
• Objects have properties or attributes that can take values.
• Objects can exist in various relations with each other.
• Properties and relations can change over time.
• There are events that occur at different time instants.
• There are processes in which objects participate and that occur over time.
• The world and its objects can be in different states.
• Events can cause other events or states as effects.
• Objects can have parts.
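As a toy illustration of this shared repertoire, the commitments can be written down as plain typed assertions; every particular below (the objects, the has-life and can-eat predicates) is our own invention, echoing the examples used later in the text:

```python
# Objects, properties of objects, and relations between objects,
# written as typed assertions. All particulars are invented.
objects = {"cow", "grass", "granite"}
properties = {("has-life", "cow"), ("has-life", "grass"),
              ("contains-carbon", "cow"), ("contains-carbon", "grass")}
relations = {("can-eat", "cow", "grass")}  # relation between two objects

def holds(prop, x):
    """True if the property is asserted of the object."""
    return (prop, x) in properties

print(holds("has-life", "granite"))  # False
```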

The representational repertoire of objects, relations, states, events, and processes does not say anything about which classes of these entities exist. The modeler of the domain makes these commitments. As we move from an ontology's top to lower taxonomic levels, commitments specific to domains and phenomena appear. For modeling objects on earth, we can make certain commitments. For example, animals, minerals, and plants are subcategories of objects; has-life(x) and contains-carbon(x) are object properties; and can-eat(x, y) is a possible relation between any two objects. These commitments are specific to objects and phenomena in this domain. Further, the commitments are not arbitrary. For them to be useful, they should reflect some underlying reality.

There is no sharp division between domain-independent and domain-specific ontologies for representing knowledge. For example, the terms object, physical object, device, engine, and diesel engine all describe objects, but in an order of increasing domain specificity. Similarly, terms for relations between objects can span a range of specificity, such as connected, electrically-connected, and soldered-to.

Subtypes of concepts. Ontologies generally appear as a taxonomic tree of conceptualizations, from very general and domain-independent at the top levels to increasingly domain-specific further down in the hierarchy. We mentioned earlier that different ontologies propose different subtypes of even very general concepts. This is because, as a rule, different sets of subcategories will result from different criteria for categorization. Two, among many, alternate subcategorizations of the general concept object are physical and abstract, and living and nonliving. In some cultures and languages, words for objects have gender, thus creating another top-level classification along the gender axis. We can easily think of additional subcategorizations based on other criteria. The existence of alternate categorizations only becomes more acute as we begin to model specific domains of knowledge.
For example, we can subcategorize causal process into continuous and discrete causal processes along the dimension of how time is represented, and into mechanical, chemical, biological, cognitive, and social processes along the dimension of the kinds of objects and relations involved in the description.

In principle, the number of classification criteria and distinct subtypes is unlimited, because the number of possible dimensions along which to develop subcategories cannot be exhaustively specified. Often, this fact is not obvious in general-purpose ontologies, because the top levels of such ontologies commit to the most commonly useful subtypes. However, domain-specific ontologies can contain categorizations along dimensions that are usually outside the general ontology.

Task dependence of ontologies. How task-dependent are ontologies? Presumably, the kinds of things that actually exist do not depend on our goals. In that sense, ontologies are not task-dependent. On the other hand, what aspects of reality are chosen for encoding in an ontology does depend on the task. For example, in the domain of fruits, we would focus on particular aspects of reality if we were developing the ontology for the selection of pesticides; we would focus on other aspects if we were developing an ontology to help chefs select fruits for cooking. In ontologies for engineering applications, categorizing causal processes into those that do, and those that do not, produce dangerous side effects might be useful. Design engineers and safety analysts might find this a very useful categorization, though it is unlikely to be part of a general-purpose ontology's view of the causal process concept.

Practically speaking, an ontology is unlikely to cover all possible potential uses. In that sense, both an ontology for a domain and a knowledge base written using that ontology are likely to be more appropriate for certain uses than others and unlikely to be sharable across widely divergent tasks. This is, by now, a truism in KBS research and is the basic insight that led to the current focus on the relationship between tasks and knowledge types. Presuppositions or requirements can be associated with problem-solving methods for different tasks so that they can capture explicitly the way in which ontologies are task-dependent. For example, a method might have a presupposition (or assumption10) stating that it works correctly only if the ontology allows modeling causal processes discretely. Therefore, assumptions are a key factor in practical sharing of ontologies.
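The presupposition idea can be sketched as an explicit applicability check before a method is applied; the method name and commitment labels below are hypothetical:

```python
# Hypothetical sketch: a problem-solving method declares what the
# ontology must support, and we check that before applying the method.
method = {
    "name": "discrete-causal-simulation",
    "presupposes": {"discrete-causal-process"},  # required commitment
}
ontology_commitments = {"object", "state", "discrete-causal-process"}

def applicable(method, commitments):
    """True if every presupposition is among the ontology's commitments."""
    return method["presupposes"] <= commitments  # subset check

print(applicable(method, ontology_commitments))  # True
```

An ontology committed only to continuous causal processes would fail the check, making the task dependence explicit rather than a silent mismatch.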

Technology for ontology sharing

There have been several recent attempts to create engineering frameworks for constructing ontologies. Michael R. Genesereth and

CYC: Thing → individual object, intangible, represented thing
Wordnet: Thing → living thing, nonliving thing
GUM: Um-Thing → configuration, element, sequence
Sowa: T → concrete, process, object, abstract

Figure 2. Illustration of how ontologies differ in their analyses of the most general concepts.

Richard E. Fikes describe KIF (Knowledge Interchange Format), an enabling technology that facilitates expressing domain factual knowledge using a formalism based on augmented predicate calculus. Robert Neches and his colleagues describe a knowledge-sharing initiative,12 while Thomas R. Gruber has proposed a language called Ontolingua to help construct portable ontologies.13 In Europe, the CommonKADS project has taken a similar approach to modeling domain knowledge.14

These languages use varieties of predicate calculus as the basic formalism. Predicate calculus facilitates the representation of objects, properties, and relations. Variations such as the situation calculus introduce time so as to represent states, events, and processes. If we extend the idea of knowledge to include images and other sense modalities, we might need radically different kinds of representation. For now, predicate calculus provides a good starting point for ontology-sharing technologies.
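A rough Python rendering (not KIF syntax, and with invented object and relation names) of this predicate-calculus style: relations over objects, with an extra situation argument so that states can change over time, in the spirit of the situation-calculus variation just mentioned:

```python
# Facts as tuples; the final element names the situation in which the
# fact holds, so states and events can be represented over time.
facts = {
    ("connected", "r1", "c1", "s0"),  # r1 connected to c1 in situation s0
    ("connected", "r1", "c1", "s1"),
    ("charged", "c1", "s1"),          # c1 is charged only in situation s1
}

def true_in(*atom):
    """True if the atom is asserted in the fact base."""
    return atom in facts

print(true_in("charged", "c1", "s0"), true_in("charged", "c1", "s1"))  # False True
```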

Using a logical notation for writing and sharing ontologies does not imply any commitment to implementing a related knowledge system or a related logic. We are simply taking a knowledge-level5 stance in describing the knowledge system, whatever the means of implementation. In this view, we can ask of any intelligent system, even one implemented as a neural network, "What does the system know?"

Use of ontologies

In AI, knowledge in computer systems is thought of as something that is explicitly represented and operated on by inference processes. However, that is an overly narrow view. All information systems traffic in knowledge. Any software that does anything useful cannot be written without a commitment to a model of the relevant world: to entities, properties, and relations in that world.

JANUARY/FEBRUARY 1999

Data structures and procedures implicitly or explicitly make commitments to a domain ontology. It is common to ask whether a payroll system knows about the new tax law, or whether a database system knows about employee salaries. Information-retrieval systems, digital libraries, integration of heterogeneous information sources, and Internet search engines need domain ontologies to organize information and direct the search processes. For example, a search engine has categories and subcategories that help organize the search. The search-engine community commonly refers to these categories and subcategories as ontologies.

Object-oriented design of software systems similarly depends on an appropriate domain ontology. Objects, their attributes, and their procedures more or less mirror aspects of the domain that are relevant to the application. Object systems representing a useful analysis of a domain can often be reused for a different application program. Object systems and ontologies emphasize different aspects, but we anticipate that convergence between these technologies will increase over time. As information systems model large knowledge domains, domain ontologies will become as important in general software systems as in many areas of AI.

In AI, while knowledge representation pervades the entire field, two application areas in particular have depended on a rich body of knowledge. One of them is natural-language understanding (NLU). Ontologies are useful in NLU in two ways. First, domain knowledge often plays a crucial role in disambiguation. A well-designed domain ontology provides the basis for domain knowledge representation. In addition, the ontology of a domain helps identify the semantic categories that are involved in understanding discourse in that domain. For this use, the ontology plays the role of a concept dictionary. In general, for NLU, we need



Related work

The field of ontology attracts an interdisciplinary mix of researchers, both from academia and industry. Here we give a selection of references that describe related ontology work. Because the literature is vast, a complete list is impossible. For an extensive collection of (alphabetically ordered) links to ontological work, including proceedings and events, see http://www.cs.utexas.edu/users/mj&b/related.html.

Special issues on ontology

N. Guarino and R. Poli, "The Role of Ontology in the Information Technology," Int'l J. Human-Computer Studies, Vol. 43, Nos. 5/6, Nov.-Dec. 1995, pp. 623-965.
G. Van Heijst, A.T. Schreiber, and B.J. Wielinga, "Using Explicit Ontologies in KBS Development," Int'l J. Human-Computer Studies, Vol. 46, Nos. 2/3, Feb.-Mar. 1997, pp. 183-292.
M. Uschold and A. Tate, "Putting Ontologies to Use," Knowledge Eng. Rev., Vol. 13, No. 1, Mar. 1998, pp. 1-3.

Ontology development

J. Benjamin et al., "Ontology Construction for Technical Domains," Proc. EKAW 96: European Knowledge Acquisition Workshop, Lecture Notes in Artificial Intelligence No. 1076, Springer-Verlag, Berlin, 1996, pp. 98-114.
W.N. Borst and J.M. Akkermans, "Engineering Ontologies," Int'l J. Human-Computer Studies, Vol. 46, Nos. 2/3, Feb.-Mar. 1997, pp. 365-406.
A. Farquhar, R. Fikes, and J. Rice, "The Ontolingua Server: A Tool for Collaborative Ontology Construction," Int'l J. Human-Computer Studies, Vol. 46, No. 6, June 1997, pp. 707-728.
A. Gomez-Perez, A. Fernandez, and M.D. Vicente, "Towards a Method to Conceptualize Domain Ontologies," Working Notes 1996 European Conf. Artificial Intelligence (ECAI 96) Workshop on Ontological Eng., ECCAI, Budapest, Hungary, 1996, pp. 41-52.
T.R. Gruber, "Towards Principles for the Design of Ontologies Used for Knowledge Sharing," Int'l J. Human-Computer Studies, Vol. 43, Nos. 5/6, Nov.-Dec. 1995, pp. 907-928.
R. Studer, V.R. Benjamins, and D. Fensel, "Knowledge Engineering: Principles and Methods," Data and Knowledge Eng., Vol. 25, Mar. 1998, pp. 161-197.
M. Uschold and M. Gruninger, "Ontologies: Principles, Methods, and Applications," Knowledge Eng. Rev., Vol. 11, No. 2, Mar. 1996, pp. 93-155.

Natural-language ontology

J.A. Bateman, B. Magnini, and F. Rinaldi, "The Generalized Upper Model," Working Papers 1994 European Conf. Artificial Intelligence (ECAI 94) Workshop on Implemented Ontologies, 1994, pp. 34-45; http://www.darmstadt.gmd.de/publish/komet/papers/ecai94.ps.
K. Knight and S. Luk, "Building a Large-Scale Knowledge Base for Machine Translation," Proc. AAAI 94, AAAI Press, Menlo Park, Calif., 1994.
G.A. Miller, "Wordnet: An Online Lexical Database," Int'l J. Lexicography, Vol. 3, No. 4, 1990, pp. 235-312.
P.E. Van der Vet, P.H. Speel, and N.J.I. Mars, "The Plinius Ontology of Ceramic Materials," Working Papers 1994 European Conf. Artificial Intelligence (ECAI 94) Workshop on Implemented Ontologies, ECCAI, Amsterdam, 1994, pp. 187-206.

Ontologies and information sources

Y. Arens et al., "Retrieving and Integrating Data from Multiple Information Sources," Int'l J. Intelligent and Cooperative Information Systems, Vol. 2, No. 2, 1993, pp. 127-158.
S. Chawathe, H. Garcia-Molina, and J. Widom, "Flexible Constraint Management for Autonomous Distributed Databases," IEEE Data Eng. Bulletin, Vol. 17, No. 2, 1994, pp. 23-27.
S. Decker et al., "Ontobroker: Ontology-Based Access to Distributed and Semi-Structured Information," Semantic Issues in Multimedia Systems, R. Meersman et al., eds., Kluwer Academic Publishers, Boston, 1999.

both a general-purpose upper ontology and a domain-specific ontology that focuses on the domain of discourse (such as military communications or business stories). CYC, Wordnet,8 and Sensus15 are examples of sharable ontologies that have been used for language understanding.

Knowledge-based problem solving is the second area in AI that is a big consumer of knowledge. KBPS systems solve a variety of problems (such as diagnosis, planning, and design) by using a rich body of knowledge. Currently, KBPS systems employ domain-specific knowledge, which is often sufficient for constructing knowledge systems that target specific application areas and tasks. However, even in specific application areas, knowledge systems can fail catastrophically when they are pushed to the edge of the capability of the domain-specific knowledge. In response to this particular shortcoming, researchers have proposed that problem-solving systems need commonsense knowledge in addition to domain-specific knowledge. The initial motivation for CYC was to provide such a body of sharable commonsense knowledge for knowledge-based systems. There is a similar need for developing domain-specific knowledge. Thus, ontology-based knowledge-base development provides a double advantage. The ontologies themselves are sharable. With these ontologies, we can build knowledge bases using the structure of conceptualizations to encode specific pieces of knowledge. The knowledge bases that we develop using these ontologies can be shared more reliably, because the formal ontology that underlies them helps clarify the representation's semantics.

Information systems and NLU systems need factual knowledge about their domains of discourse. The inferences they make are usually simple. Problem-solving systems, in contrast, engage in complex sequences of inferences to achieve their goals. Such systems need to have reasoning strategies that


enable them to choose among alternative reasoning paths. Ontology specification in knowledge systems has two dimensions:9

• Domain factual knowledge provides knowledge about the objective realities in the domain of interest (objects, relations, events, states, causal relations, and so forth).
• Problem-solving knowledge provides knowledge about how to achieve various goals. A piece of this knowledge might be in the form of a problem-solving method specifying, in a domain-independent manner, how to accomplish a class of goals.
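The two dimensions can be kept separate in code as well. Below, a hypothetical domain-independent method (a simple matcher that proposes faults whose known effects cover the observations) is applied to domain factual knowledge it knows nothing about in advance; the fault model and symptom names are our own invention:

```python
# Domain factual knowledge: faults and the symptoms they produce.
fault_model = {
    "blown-fuse": {"no-power"},
    "dead-battery": {"no-power", "dim-lights"},
}

def diagnose(fault_model, observations):
    """Domain-independent method: faults whose effects cover the observations."""
    return sorted(f for f, effects in fault_model.items()
                  if observations <= effects)

print(diagnose(fault_model, {"no-power", "dim-lights"}))  # ['dead-battery']
```

The same `diagnose` function would work unchanged on a fault model for turbines or circuits, which is the sense in which problem-solving knowledge can be specified independently of the domain.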

Most early research in KBPS mixed factual and problem-solving knowledge into highly domain-specific rules, called domain knowledge. As research progressed, it became clear that there were systematic commonalities in reasoning strategies between



S. Luke et al., "Ontology-Based Web Agents," Proc. First Int'l Conf. Autonomous Agents, ACM Press, New York, 1997, pp. 59-66; http://www.cs.umd.edu/projects/plus/SHOE/.
S.T. Polyak et al., "Applying the Process Interchange Format (PIF) to a Supply Chain Process Interoperability Scenario," Proc. 1998 European Conf. Artificial Intelligence (ECAI 98) Workshop on Applications of Ontologies and Problem-Solving Methods, ECCAI, Brighton, England, 1998, pp. 88-96.
G. Wiederhold, "Intelligent Integration of Information," J. Intelligent Information Systems, Vol. 6, Nos. 2/3, 1996.
G. Wiederhold and M. Genesereth, "The Conceptual Basis for Mediation Services," IEEE Intelligent Systems, Vol. 12, No. 5, Sept./Oct. 1997, pp. 38-47.

Ontologies and knowledge management

A. Abecker et al., "Toward a Technology for Organizational Memories," IEEE Intelligent Systems, Vol. 13, No. 3, May/June 1998, pp. 40-48.
V.R. Benjamins and D. Fensel, "The Ontological Engineering Initiative (KA)2," Formal Ontology in Information Systems, N. Guarino, ed., IOS Press, Amsterdam, 1998, pp. 287-301.
M.S. Fox, J. Chionglo, and F. Fadel, "A Common-Sense Model of the Enterprise," Proc. Industrial Eng. Research Conf., Inst. for Industrial Engineers, Norcross, Ga., 1993, pp. 425-429.
Manual of the Toronto Virtual Enterprise, tech. report, Enterprise Integration Laboratory, Dept. of Industrial Eng., Univ. of Toronto, Toronto, 1995.
M. Uschold et al., "The Enterprise Ontology," Knowledge Eng. Rev., Vol. 13, No. 1, Mar. 1998.

    Task and method ontologies
    D. Fensel et al., "Using Ontologies for Defining Tasks, Problem-Solving Methods, and Their Mappings," Knowledge Acquisition, Modeling, and Management, E. Plaza and V.R. Benjamins, eds., Springer-Verlag, Berlin, 1997, pp. 113-128.
    J.H. Gennari et al., "Mapping Domains to Methods in Support of
    goals of similar types. These reasoning strategies were also characterized by their need for specific types of domain factual knowledge. It soon became clear that strategic knowledge could be abstracted and reused.
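To make this separation concrete, here is a small sketch of our own (not from the published work; every name and rule in it is invented for illustration): a generic establish-and-refine classification strategy is written once as code, while the domain factual knowledge, here a toy electronics fault hierarchy, is supplied as data, so the same strategy could be reused with, say, a medical hierarchy.

```python
# Illustrative sketch: a reusable reasoning strategy kept separate from
# domain factual knowledge. All names and rules are hypothetical.

def establish_and_refine(hierarchy, node, evidence):
    """Refine an established hypothesis into its children; an established
    leaf is an answer; a node with no established children stands alone."""
    if not hierarchy[node]["test"](evidence):
        return []                       # hypothesis not established
    children = hierarchy[node]["children"]
    if not children:
        return [node]                   # established leaf hypothesis
    results = []
    for child in children:
        results.extend(establish_and_refine(hierarchy, child, evidence))
    return results or [node]            # fall back to the parent node

# Domain factual knowledge (a toy, invented fault hierarchy):
faults = {
    "device-fault":    {"test": lambda e: True,
                        "children": ["power-fault", "amplifier-fault"]},
    "power-fault":     {"test": lambda e: e["voltage"] < 4.5, "children": []},
    "amplifier-fault": {"test": lambda e: e["gain"] < 10, "children": []},
}

print(establish_and_refine(faults, "device-fault",
                           {"voltage": 3.0, "gain": 50}))  # ['power-fault']
```

The strategy function never mentions transistors or voltages by name; swapping in a different hierarchy dictionary reuses the same strategic knowledge in a new domain.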

    With few exceptions [16, 17], the domain factual knowledge dimension drives the focus of most of the AI investigations on ontologies. This is because applications to language understanding motivate much of the work on ontologies. Even CYC, which was originally motivated by the need for knowledge systems to have world knowledge, has been tested more in natural-language than in knowledge-systems applications.

    KBS RESEARCHERS REALIZED that, in addition to factual knowledge, there is knowledge about how to achieve problem-solving goals. In fact, this emphasis on meth-

    Reuse," Int'l J. Human-Computer Studies, Vol. 41, No. 3, Sept. 1994, pp. 399-424.
    A. Tate, "Roots of SPAR: Shared Planning and Activity Representation," Knowledge Eng. Rev., Vol. 13, No. 1, Mar. 1998, pp. 121-128.
    Y.A. Tijerino and R. Mizoguchi, "Multis II: Enabling End-Users to Design Problem-Solving Engines via Two-Level Task Ontologies," Proc. EKAW 93: Seventh European Workshop on Knowledge Acquisition for Knowledge-Based Systems, Lecture Notes in Artificial Intelligence No. 723, Springer-Verlag, 1993, pp. 340-359.

    Ontology workshops


    Applications of Ontologies and Problem-Solving Methods, ECAI 98 (European Conf. AI), http://delicias.dia.fi.upm.es/WORKSHOP/ECAI98/index.html
    Building, Maintaining, and Using Organizational Memories, ECAI 98, http://www.aifb.uni-karlsruhe.de/WBS/ECAI98OM/
    Formal Ontologies in Information Systems (FOIS 98), http://krr.irst.itc.it:1024/fois98/program.html
    Intelligent Information Integration, ECAI 98, http://www.tzi.de/grp/i3/ws-ecai98/
    Sharable and Reusable Components for Knowledge Systems, KAW 98 (Workshop on Knowledge Acquisition, Modeling, and Management), http://ksi.cpsc.ucalgary.ca/KAW/KAW98/KAW98Proc.html
    Ontological Engineering, AAAI Spring Symp. Series, Stanford, Calif., 1997, http://www.aaai.org/Symposia/Spring/1997/sss-97.html
    Problem-Solving Methods, IJCAI 97 (Int'l Joint Conf. AI), http://www.aifb.uni-karlsruhe.de/WBS/dfe/PSM/main.html
    Ontological Engineering, ECAI 96, http://wwwis.cs.utwente.nl:8080/kbs/EcaiWorkshop/homepage.html
    Practical Aspects of Ontology Development, AAAI 96
    Sharable and Reusable Ontologies, KAW 96, http://ksi.cpsc.ucalgary.ca/KAW/KAW96/KAW96Proc.html
    Sharable and Reusable Problem-Solving Methods, KAW 96, http://ksi.cpsc.ucalgary.ca/KAW/KAW96/KAW96Proc.html

    ods appropriate for different types of problems fueled second-generation research in knowledge systems [18]. Most of the KBS community's work on knowledge representation is not well known to the general knowledge-representation community. In the coming years, we expect an increased focus on method ontologies as a sharable knowledge resource.
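One way to picture a sharable method ontology, as a rough sketch of our own (the vocabulary below is invented, not taken from any published method library), is a set of declarations that describe each problem-solving method's task type and the domain knowledge roles it assumes, so that a library can index methods and match them to tasks:

```python
# Hypothetical sketch of method-ontology entries: each problem-solving
# method declares the task it serves and the knowledge roles it requires.
from dataclasses import dataclass, field

@dataclass
class MethodDescription:
    name: str
    task_type: str                 # e.g., "classification", "parametric-design"
    inputs: list
    outputs: list
    knowledge_roles: list = field(default_factory=list)

LIBRARY = [
    MethodDescription("establish-and-refine", "classification",
                      inputs=["evidence"], outputs=["hypotheses"],
                      knowledge_roles=["refinement-hierarchy", "establish-tests"]),
    MethodDescription("propose-and-revise", "parametric-design",
                      inputs=["requirements"], outputs=["parameter-values"],
                      knowledge_roles=["proposers", "constraints", "fixes"]),
]

def methods_for(task_type, available_roles):
    """Return methods that serve the task and whose knowledge needs are met."""
    return [m.name for m in LIBRARY
            if m.task_type == task_type
            and set(m.knowledge_roles) <= set(available_roles)]

print(methods_for("classification",
                  ["refinement-hierarchy", "establish-tests"]))
# ['establish-and-refine']
```

Making the knowledge roles explicit is what turns a method from private code into a sharable resource: a reuser can see exactly what domain knowledge must be supplied before the method applies.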

    Acknowledgments
    This article is based on work supported by the Office of Naval Research under Grant N00014-96-1-0701. We gratefully acknowledge the support of ONR and the DARPA RaDEO program. Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of ONR. The Netherlands Computer Science Research Foundation supported Richard Benjamins with financial support from the Netherlands Organization for Scientific Research (NWO).

    References
    1. D.B. Lenat and R.V. Guha, Building Large Knowledge-Based Systems: Representation and Inference in the CYC Project, Addison-Wesley, Reading, Mass., 1990.
    2. B. Chandrasekaran, "AI, Knowledge, and the Quest for Smart Systems," IEEE Expert, Vol. 9, No. 6, Dec. 1994, pp. 2-6.
    3. J. McCarthy and P.J. Hayes, "Some Philosophical Problems from the Standpoint of Artificial Intelligence," Machine Intelligence Vol. 4, B. Meltzer and D. Michie, eds., Edinburgh University Press, Edinburgh, 1969, pp. 463-502.
    4. D. Marr, Vision: A Computational Investigation into the Human Representation and Processing of Visual Information, W.H. Freeman, San Francisco, 1982.
    5. A. Newell, "The Knowledge Level," Artificial Intelligence, Vol. 18, 1982, pp. 87-127.
    6. R. Wieringa and W. de Jonge, "Object Identifiers, Keys and Surrogates: Object Identifiers Revisited," Theory and Practice of Object Systems (TAPOS), Vol. 1, No. 2, 1995, pp. 101-114.
    7. J.A. Bateman, B. Magnini, and F. Rinaldi, "The Generalized Upper Model," Working Papers 1994 European Conf. Artificial Intelligence (ECAI 94) Workshop on Implemented Ontologies, 1994, pp. 34-45; http://www.darmstadt.gmd.de/publish/komet/papers/ecai94.ps.
    8. G.A. Miller, "Wordnet: An Online Lexical Database," Int'l J. Lexicography, Vol. 3, No. 4, 1990, pp. 235-312.
    9. N. Fridman Noy and C.D. Hafner, "The State of the Art in Ontology Design," AI Magazine, Vol. 18, No. 3, 1997, pp. 53-74.
    10. D. Fensel and V.R. Benjamins, "The Role of Assumptions in Knowledge Engineering," Int'l J. Intelligent Systems, Vol. 13, No. 7, 1998, pp. 715-747.
    11. M.R. Genesereth and R.E. Fikes, Knowledge Interchange Format, Version 3.0, Knowledge Systems Lab., Stanford Univ., Stanford, Calif., 1992.
    12. R. Neches et al., "Enabling Technology for Knowledge Sharing," AI Magazine, Vol. 12, No. 3, 1991, pp. 36-56.
    13. T.R. Gruber, "A Translation Approach to Portable Ontology Specifications," Knowledge Acquisition, Vol. 5, 1993, pp. 199-220.
    14. G. Schreiber et al., "CommonKADS: A Comprehensive Methodology for KBS Development," IEEE Expert, Vol. 9, No. 6, Dec. 1994, pp. 28-37.
    15. K. Knight and S. Luk, "Building a Large-Scale Knowledge Base for Machine Translation," Proc. Am. Assoc. Artificial Intelligence, AAAI Press, Menlo Park, Calif., 1994.
    16. D. Fensel et al., "Using Ontologies for Defining Tasks, Problem-Solving Methods and Their Mappings," Knowledge Acquisition, Modeling and Management, E. Plaza and V.R. Benjamins, eds., Springer-Verlag, New York, 1997, pp. 113-128.
    17. R. Mizoguchi, J. Van Welkenhuysen, and M. Ikeda, "Task Ontology for Reuse of Problem Solving Knowledge," Towards Very Large Knowledge Bases, N.J.I. Mars, ed., IOS Press, Amsterdam, 1995.
    18. J.M. David, J.P. Krivine, and R. Simmons, Second Generation Expert Systems, Springer-Verlag, 1993.

    JANUARY/FEBRUARY 1999

    B. Chandrasekaran is professor emeritus, a senior research scientist, and the director of the Laboratory for AI Research (LAIR) in the Department of Computer and Information Science at Ohio State University. His research focuses on knowledge-based systems, causal understanding, diagrammatic reasoning, and cognitive architectures. He received his BE from Madras University and his PhD from the University of Pennsylvania, both in electrical engineering. He was Editor-in-Chief of IEEE Expert from 1990 to 1994, and he serves on the editorial boards of numerous international journals. He is a fellow of the IEEE, AAAI, and ACM. Contact him at the Laboratory for AI Research, Ohio State Univ., Columbus, OH 43210; [email protected]; http://www.cis.ohio-state.edu/lair/.

    John R. Josephson is a research scientist and the associate director of the Laboratory for AI Research in the Department of Computer and Information Science at Ohio State University. His primary research interests are knowledge-based systems, abductive inference, causal reasoning, theory formation, speech recognition, perception, diagnosis, the logic of investigation, and the foundations of science. He received his BS and MS in mathematics and his PhD in philosophy, all from Ohio State University. He has worked in several application domains, including medical diagnosis, medical test interpretation, diagnosis of engineered systems, logistics planning, speech recognition, molecular biology, design of electromechanical systems, and interpretation of aerial photographs. He is the coeditor, with Susan Josephson, of Abductive Inference: Computation, Philosophy, Technology, Cambridge Univ. Press, 1994. Contact him at the Laboratory for AI Research, Ohio State Univ., Columbus, OH 43210; [email protected]; http://www.cis.ohio-state.edu/lair/.

    Richard Benjamins is a senior researcher and lecturer at the Department of Social Science Informatics at the University of Amsterdam. His research interests include knowledge engineering, problem-solving methods and ontologies, diagnosis and planning, and AI and the Web. He obtained his BS in cognitive psychology and his PhD in artificial intelligence from the University of Amsterdam. Contact him at the Dept. of Social Science Informatics, Univ. of Amsterdam, Roetersstraat 15, 1018 WB Amsterdam, The Netherlands; [email protected]; http://www.swi.psy.uva.nl/usr/richard/home.html.
