Data and Knowledge Representation Lecture 3


Presentation Transcript


  1. Data and Knowledge Representation Lecture 3 Qing Zeng, Ph.D.

  2. Last Time We Talked About • Boolean Algebra • Predicate Logic (First order logic)

  3. Today We Will Talk About • Ontology • Major KR Schemes

  4. Tell me what’s in this room • Tables, chairs, windows, computers, papers, pens, people, etc. • We can write this down in logic • But what is a table? What is a room? • Logic has no vocabulary of its own
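
To make the last point concrete, here is one way “we can write” the room inventory in first-order logic. The formula is a hedged illustration of my own (the slide’s own formula is not preserved in the transcript); the predicate names Table, In, and ThisRoom are exactly the vocabulary that logic itself does not supply:

```latex
% "There is a table in this room" -- illustrative only; the predicate
% vocabulary (Table, In, ThisRoom) must be supplied by an ontology.
\exists x \, \bigl( \mathrm{Table}(x) \wedge \mathrm{In}(x, \mathrm{ThisRoom}) \bigr)
```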

  5. Ontology Fills the Gap • Ontology is the study of existence, of all kinds of existence and all kinds of entities • It supplies the predicates of predicate logic and the labels that fill the boxes and circles of conceptual graphs

  6. Webster’s Definition of Ontology • “1: a branch of metaphysics concerned with the nature and relations of being 2: a particular theory about the nature of being or the kinds of existents” -- http://www.webster.com/cgi-bin/dictionary

  7. My Simplified Understanding • Ontology seeks to describe entities through classification of relations among entities • Domain ontology limits its scope to a specific domain such as medicine • In informatics, we further limit domain ontology to what is needed by an application or certain kinds of applications, such as clinical guidelines or retrieval of pathology information

  8. Why Ontology in the Biomedical Domain • Encode data • E.g. Patient A is diabetic and HIV positive • Represent knowledge • E.g. A blood glucose test is a diagnostic test for diabetes.

  9. Sources of Ontology • Observation: provides knowledge of the physical world • Reasoning: makes sense of observations by generating a framework of abstractions called “metaphysics”.

  10. Ontology Development in the Biomedical Domain • Areas that directly involve ontology • Data models • Vocabulary/terminology • Knowledge-based systems

  11. Philosopher’s Approach to Ontology • Top-down • Concerned with the entire universe • Build top level ontology first • Long history • Lao Zi (Book of Tao) • Plato • Aristotle • Kant (1787)

  12. Computer/Information Science’s Approach • Bottom-up • Start with a limited world or specific applications • Exception: Cyc system • Designed with computing in mind • Short history • First use of the term “ontology” in the computer science community: McCarthy, J. (1980). “Circumscription – A Form of Non-Monotonic Reasoning”, Artificial Intelligence, 13: 27–39.

  13. Problems Faced by Computer/Information Scientists • Tower of Babel • Ontologies used/developed by different groups for different applications • Terminological and conceptual incompatibilities • Problems arise in system development and maintenance, as well as in data/knowledge exchange • Insufficient expressive power

  14. Example • Problem-Oriented Medical Record • Weed LL. Medical records that guide and teach. 1968. MD Comput. 1993 Mar-Apr;10(2):100-14. • Where “SOAP” comes from… • The gist: organizing medical data/information by patient problem • Many EMRs have a place for a “problem list”

  15. Example • Which one of the following is a “problem”? • Cough • Anxiety • Pregnancy • Sleep disorder • Rash • Physicians cannot agree • Cited by a number of POEMRs as one of the reasons for failure

  16. Another Example • What does “acute” mean? • Sharpness or severity, e.g. acute pain • Having a sudden onset, sharp rise, and short course, e.g. acute pancreatitis • In a data model for findings, we had severity as an attribute, so we needed to decide where “acute” fits in.

  17. To Solve the Problem • Develop formalisms for sharing (e.g. KIF, CGIF) • Develop standard ontologies • Develop new formalisms to increase expressive power

  18. Ontological Categories • Making a choice of ontological categories is the first step in system design – John Sowa • An ontological category corresponds to • “Class” in OO systems • “Domain” in database theory • “Type” in AI theory • “Type” or “sort” in logic

  20. Brentano’s Tree of Aristotle’s Categories • [Tree diagram, not preserved in the transcript; its nodes are Being, Substance, Accident, Property, Relation, Inherence, Directedness, Containment]

  21. CYC Ontology • [Tree diagram of the upper Cyc ontology, not preserved in the transcript; its nodes include Thing, RepresentedThing, IndividualObject, Intangible, Relationship, Event, Stuff, IntangibleObject, Collection]

  22. Contrast -> Distinction • All perceptions start with contrast • Bright – dark • Tall – short • Healthy – ill • Happy – sad • Distinctions (discrete or continuous) are conceptual interpretations of perceptual contrasts

  24. Distinction -> Categories • Distinctions may be combined to generate categories, e.g. • Classify patients • Distinctions: (insured, uninsured), (inpatient, outpatient), (infant, child, adult), (emergency, urgent, general), … • Categories: insured pediatric emergency patient, uninsured adult inpatient, …
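
A minimal sketch of how combining distinctions generates categories (Python; the distinction values come from the slide, everything else is illustrative):

```python
from itertools import product

# Each distinction is a set of mutually exclusive values (from the slide).
distinctions = [
    ("insured", "uninsured"),
    ("inpatient", "outpatient"),
    ("infant", "child", "adult"),
    ("emergency", "urgent", "general"),
]

# Every choice of one value per distinction is a candidate category.
categories = [" ".join(choice) for choice in product(*distinctions)]

print(len(categories))   # 2 * 2 * 3 * 3 = 36 candidate categories
print(categories[0])     # "insured inpatient infant emergency"
```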

  25. Sowa’s Ontology (Peirce and Whitehead) • AXIOMS: • Physical: physical entities have location in space and a point in time. E.g. hand, hair, computer. • Abstract: abstract entities do not have location in space or a point in time. E.g. theorem, knowledge, story.

  26. Sowa’s Ontology • AXIOMS: • Independent: independent entities can exist without being dependent on the existence of another entity. E.g. person, diary, song. • Relative: relative entities require the existence of some other entity. E.g. joints between bones, middle child, remission after a disease episode. • Mediating: mediating entities require the existence of (at least) two other entities and establish new relationship among them. E.g. theory of relativity, diagnostic strategy, cardiovascular system.

  27. Sowa’s Ontology • AXIOMS: • Continuant: has only spatial parts and no temporal parts; its identity does not depend on location in space and time. E.g. gender, alert and reminder system, medication formula. • Occurrent: has both spatial parts (participants) and temporal parts (stages); can only be identified by its location in space and time. E.g. disease episode, clinical event, medication order.

  28. Matrix of Central Categories
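
The matrix itself was shown as an image and is not preserved in the transcript. As a reconstruction, assuming it crosses the three distinctions from the preceding slides (Independent/Relative/Mediating, Physical/Abstract, Continuant/Occurrent), here is a sketch with the twelve category names as they usually appear in Sowa’s Knowledge Representation; treat the names as a reconstruction rather than the slide’s exact content:

```python
# Reconstruction of Sowa's matrix of central categories: each cell is the
# combination of one value from each of the three distinctions above.
central_categories = {
    ("Independent", "Physical", "Continuant"): "Object",
    ("Independent", "Physical", "Occurrent"):  "Process",
    ("Independent", "Abstract", "Continuant"): "Schema",
    ("Independent", "Abstract", "Occurrent"):  "Script",
    ("Relative",    "Physical", "Continuant"): "Juncture",
    ("Relative",    "Physical", "Occurrent"):  "Participation",
    ("Relative",    "Abstract", "Continuant"): "Description",
    ("Relative",    "Abstract", "Occurrent"):  "History",
    ("Mediating",   "Physical", "Continuant"): "Structure",
    ("Mediating",   "Physical", "Occurrent"):  "Situation",
    ("Mediating",   "Abstract", "Continuant"): "Reason",
    ("Mediating",   "Abstract", "Occurrent"):  "Purpose",
}
```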

  29. Exercise Assume you are developing an alert system to monitor errors in laboratory information systems. Identify some distinctions for categorizing the errors and describe which distinctions are in contrast with which other distinctions.

  30. Semantic Network • A long-standing notion: there are different pieces of knowledge of the world, and they are all linked together through certain semantics.

  31. Basic Components • Nodes • Represent concepts • Arcs • Represent relations • Labels for nodes and arcs
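
A minimal sketch of these components in Python (the concept and relation names are illustrative; the relations echo slide 35):

```python
from collections import defaultdict

class SemanticNetwork:
    """Labeled nodes connected by labeled, directed arcs."""
    def __init__(self):
        self.arcs = defaultdict(list)          # node -> [(relation, node)]

    def add(self, source, relation, target):
        self.arcs[source].append((relation, target))

    def related(self, source, relation):
        return [t for r, t in self.arcs[source] if r == relation]

net = SemanticNetwork()
net.add("insulin", "TREATS", "diabetes")
net.add("diabetes", "IS_A", "endocrine disorder")
net.add("blood glucose test", "MEASURES", "blood glucose")

print(net.related("insulin", "TREATS"))        # ['diabetes']
```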

  32. Little Constraint • [Diagram, not preserved in the transcript: patient, nurse, and physician nodes connected pairwise by “interact” arcs]

  33. Little Constraint • [Diagram, not preserved in the transcript: Web pages (DSG site, instructors’ homepage, course site) connected by “link” arcs]

  34. Relation • Directed or undirected • Multiple relations between two concepts • Can have different properties • Reflexive (e.g. co-occurrence) • Transitive (e.g. causal) • Symmetric (e.g. sibling) • …
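
A transitive relation licenses chaining: if a causes b and b causes c, then a indirectly causes c. A hedged sketch of exploiting that property over a toy causal network (the edges are invented for illustration):

```python
# Toy CAUSE_OF relation; transitivity lets us chase indirect effects.
cause_of = {
    "smoking": {"emphysema"},
    "emphysema": {"respiratory failure"},
}

def transitive_closure(relation, start):
    """All entities reachable from `start` by chaining the relation."""
    seen, frontier = set(), {start}
    while frontier:
        nxt = set()
        for x in frontier:
            for y in relation.get(x, ()):
                if y not in seen:
                    seen.add(y)
                    nxt.add(y)
        frontier = nxt
    return seen

print(transitive_closure(cause_of, "smoking"))
# {'emphysema', 'respiratory failure'}
```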

  35. Some Commonly Used Relations in the Biomedical Domain • IS A • IS PART OF • CAUSE OF • MEASURES • CO-OCCURS • …

  36. Major Limitation • Lack of semantics • No formal semantics for the relations • E.g. does “IS A” mean subclass, member, etc.? • Multiple interpretations are possible • Restricted expressiveness • E.g. cannot distinguish between instance and class
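
The two readings of an unlabeled IS A link differ logically; a small illustration (the concept and individual names are mine, not the slide’s):

```latex
% Subclass reading: every instance of the first class is an instance of the second.
\forall x \, \bigl( \mathrm{DiabeticPatient}(x) \rightarrow \mathrm{Patient}(x) \bigr)
% Membership reading: a particular individual belongs to the class.
\mathrm{DiabeticPatient}(\mathrm{PatientA})
```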

  37. Extension • Extending expressivity (distinguishing different types of concepts and relations) • Distinguish between “some” and “all” • Distinguish between “extension” and “intension” • Distinguish between “definition” and “assertion” • Add semantic rigor • Map to logic (Sowa – CG)

  38. Frame-based Network • Distinguishes instances vs. classes • Hierarchical structure (superclasses and subclasses) • Multiple hierarchies • Slots • Member slots • Own slots

  39. Slot • Frame identifying information • Relationship between frames • Descriptors of requirements for frame match • Procedural information • Default information • Restrictions and constraints • New instance information
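
A minimal frame sketch in Python, assuming single inheritance only (real frame systems such as Protégé allow multiple parents); the frame and slot names are illustrative:

```python
class Frame:
    """A frame with named slots; unknown slots are inherited from the parent."""
    def __init__(self, name, parent=None, **slots):
        self.name, self.parent, self.slots = name, parent, slots

    def get(self, slot):
        if slot in self.slots:              # own or overridden slot value
            return self.slots[slot]
        if self.parent is not None:         # inherit the default from the superclass
            return self.parent.get(slot)
        raise KeyError(slot)

finding    = Frame("Finding", severity="unspecified")                 # class frame
pain       = Frame("Pain", parent=finding, body_site=None)            # subclass frame
acute_pain = Frame("acute pain #1", parent=pain, severity="severe")   # instance frame

print(acute_pain.get("severity"))    # "severe": own slot overrides the inherited default
print(acute_pain.get("body_site"))   # None: inherited member slot
```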

  40. Strengths • Helps organize knowledge hierarchically • Procedural information • Supports multiple inheritance

  41. Weaknesses • Expressiveness (e.g. quantifiers) • Inheritance • Subclassing (overriding slot values) • Multiple inheritance • Large, complex knowledge systems

  42. Example: MED

  43. Example: Protégé

  44. Example: Protégé

  45. Example: Protégé

  46. Production Rules • Also called IF-THEN rules • Many forms: • IF condition THEN action • IF premise THEN conclusion • IF proposition p1 and proposition p2 are true THEN proposition p3 is true

  47. Components • Rule base • Inference engine • Working memory

  48. Inference • Modus ponens • Forward chaining • Modus tollens • Backward chaining
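
A minimal forward-chaining sketch in Python showing the three components from slide 47: a rule base, a working memory of facts, and an inference loop that fires any rule whose conditions are satisfied (repeated modus ponens). The facts and rules are illustrative, loosely modeled on the MYCIN rule on the next slide (without its certainty factor):

```python
# Rule base: (set of conditions, conclusion) pairs.
rules = [
    ({"identity unknown", "gram-negative", "rod", "aerobic"},
     "probably enterobacteriaceae"),
    ({"probably enterobacteriaceae"}, "suggest confirmatory culture"),
]

# Working memory: the facts currently believed.
working_memory = {"identity unknown", "gram-negative", "rod", "aerobic"}

def forward_chain(rules, facts):
    """Inference engine: fire rules until no new facts can be added."""
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)      # fire the rule
                changed = True
    return facts

print(forward_chain(rules, set(working_memory)))
```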

  49. Example: MYCIN IF the identity of the germ is not known with certainty AND the germ is gram-negative AND the morphology of the organism is "rod" AND the germ is aerobic THEN there is a strong probability (0.8) that the germ is of type enterobacteriaceae

  50. Example: POINT • Main Inference Control: controls the execution of the inference engine by retrieving and providing the needed knowledge • Jess Inference Engine: fires rules when adequate knowledge is provided • Medical Knowledge Base: defines semantic relations between concepts • Inference Rules: define rules of relevance based on the semantic relations between concepts
