
CPE/CSC 580: Knowledge Management


Presentation Transcript


  1. CPE/CSC 580: Knowledge Management Dr. Franz J. Kurfess Computer Science Department Cal Poly

  2. Course Overview • Introduction • Knowledge Processing: Knowledge Acquisition, Representation and Manipulation • Usability and Knowledge: Effective Use • Knowledge Organization: Classification, Categorization; Ontologies, Taxonomies, Thesauri • Knowledge Retrieval: Information Retrieval, Knowledge Navigation • Knowledge Presentation: Knowledge Visualization • Knowledge Exchange: Knowledge Capture, Transfer, and Distribution • Knowledge Management Techniques: Topic Maps, Agents • Knowledge Management Tools: Ontology Development, Reasoning • Knowledge Management in Organizations: Content Management Systems, Knowledge Sharing

  3. Overview: Usability of Knowledge • Introduction • usability of tools and systems vs. usability of knowledge • Usability Evaluations • Usability Frameworks • Usability Considerations for Knowledge Management • Important Concepts and Terms • Chapter Summary

  4. Logistics • Tablet PCs • ConfTool installed at http://wiki.csc.calpoly.edu/conftool/ • Term Project • Milestone Week 2 should be on the team Web page (see http://www.csc.calpoly.edu/~fkurfess/Courses/581/S07/Project/Team-Web-Pages.shtml) • Assignments • A1 (KM Tools) due Thu, April 19 • Paper and Presentation • topic proposal due “Week 3” • please submit via • Blackboard and • ConfTool (title, abstract, topic, keywords - no outline, bibliography)

  5. Usability Evaluations • formative evaluation • done at different stages of development • influences the design of the system as it is being developed • relies on quick feedback from users • or other ways to obtain feedback • summative evaluation • assesses the quality of a finished product • no influence during design and development • users can evaluate the actual product

  6. Four evaluation paradigms • ‘quick and dirty’ • usability testing • field studies • predictive evaluation

  7. Quick and Dirty • ‘quick & dirty’ evaluation describes a common practice • designers informally get feedback from users or consultants to confirm that their ideas are in line with users’ needs and are liked • quick & dirty evaluations can be done at any time • the emphasis is on fast input to the design process rather than carefully documented findings

  8. Usability Testing • recording the performance of typical users • on typical tasks in controlled settings • field observations may also be used • users are watched • recorded on video • their activities are logged • mouse movements, key presses • evaluation • calculation of performance times • identification of errors • explanation why the users did what they did • user satisfaction • questionnaires and interviews are used to elicit the opinions of users
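The quantities a usability test measures (performance times, error counts from logged activity) can be sketched as a small script. The `(timestamp, event)` log format and event names below are hypothetical, purely for illustration:

```python
# Sketch: deriving basic usability-test metrics from a logged session.
# The (timestamp, event) format and the event names are hypothetical.

def summarize_session(events):
    """Compute task completion time and error count from (seconds, event) pairs."""
    times = [t for t, _ in events]
    errors = sum(1 for _, name in events if name == "error")
    return {"completion_time": max(times) - min(times), "errors": errors}

session = [
    (0.0, "task_start"),
    (12.4, "click"),
    (15.1, "error"),     # e.g. user selected the wrong menu
    (31.8, "error"),
    (47.5, "task_done"),
]

print(summarize_session(session))  # {'completion_time': 47.5, 'errors': 2}
```

In a real study such logs would come from instrumented software or video coding; the point here is only that completion time and error rate reduce to simple aggregations over recorded events.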

  9. Field Studies • done in natural settings • to understand what users do naturally and how technology impacts them • in product design, field studies can be used to • identify opportunities for new technology • determine design requirements • decide how best to introduce new technology • evaluate technology in use

  10. Predictive Evaluation • experts apply their knowledge of typical users to predict usability problems • often guided by heuristics • another approach involves theoretical models • users need not be present • relatively quick & inexpensive

  11. DECIDE: A framework to guide evaluation • Determine the goals the evaluation addresses. • Explore the specific questions to be answered. • Choose the evaluation paradigm and techniques to answer the questions. • Identify the practical issues. • Decide how to deal with the ethical issues. • Evaluate, interpret and present the data.

  12. Determine the Goals • high-level goals of the evaluation • stakeholders • in the overall system • for the specific evaluation • selection of the usability evaluation paradigm • probably influenced by the goals • examples of goals • Identify the best presentation method for knowledge. • Check to ensure that different access methods are consistent. • Investigate how technology affects the usage of knowledge. • Improve the usability of an existing product, work flow, or common practice.

  13. Explore the Questions • questions to be asked during usability evaluations • hypothesis (“research question”) • questions can be used to clarify aspects of the goals • may include questions used during interactions with evaluation participants • examples for questions related to knowledge • Why do users need/want to know this? • How do users act on this knowledge? • What happens if they do their task without this knowledge? • What is the source of this knowledge? Who is responsible for veracity, maintenance, access control?

  14. Choose the Evaluation Paradigm & Techniques • four evaluation paradigms • ‘quick and dirty’ • usability testing • field studies • predictive evaluation • techniques • observing users • asking users about their opinions • asking experts about their opinions • testing the performance of users • modeling the task performance of users

  15. Identify Practical Issues • selection of users • types of users • size of participant pool • budget • schedule • evaluators • internal/external • facilities and equipment • usability lab • recording equipment

  16. Decide on Ethical Issues • informed consent form • participants have a right to • know the goals of the study • what will happen to the findings • privacy of personal information • not to be quoted without their agreement • leave when they wish • be treated politely

  17. Evaluate, Interpret and Present Data • may depend on the paradigm and techniques used • evaluation aspects • Reliability: can the study be replicated? • Validity: is it measuring what you thought? • Biases: is the process creating biases? • Scope: can the findings be generalized? • Ecological validity: is the environment of the study influencing it? • e.g. Hawthorne effect

  18. Usability Considerations for KM • knowledge-intensive tasks and activities • acquisition, organization, manipulation, retrieval, presentation of knowledge • knowledge-centric interaction methods • text-based, visual, auditory • tools for knowledge management • generic categories, specific tools as examples • usability measures for knowledge-intensive tasks • qualitative / quantitative • subjective / objective

  19. Activity: Knowledge Usability in Student Research • Scenario: A student (team) needs to investigate a topic, e.g. to prepare a research paper, or to work on a project. • Task: Identify activities, methods, tools, and usability measures for this scenario. • Deliverable: A document created with a tool of your choice that presents the knowledge your team collected about knowledge usability • to be posted on Blackboard AI Discussion Board

  20. Activity Worksheet: Knowledge Usability • Scenario description: • describe the sample scenario that serves as the basis for your investigation • Tasks and Activities: • what are the critical tasks and activities related to dealing with knowledge • Interaction Methods: • how do you interact with the computer to deal with the knowledge • KM Tools: • what are the tools you’re using • Usability Measures: • how do you measure the usability of the tools and methods

  21. Activity Worksheet: Knowledge Usability • Scenario description: • Tasks and Activities: • Interaction Methods: • KM Tools: • Usability Measures:

  22. Activity Lecture Preparation: Knowledge Usability • Scenario description: • an instructor needs to prepare a lecture for a Computer Science class • the material will be presented to class in a face-to-face session with computer-based presentation tools • Tasks and Activities: • determination of the topic • based on course catalog description, extended course outline, textbook, related courses • identification of essential concepts and terms • acquisition of knowledge about concepts and terms • possibly recursive until the desired level of detail is reached • creation of a framework for the arrangement of the concepts • relationship between concepts, in particular dependencies • presentation of the knowledge • sequence, method (natural language in spoken or written form, graphic, demo, simulation) • Interaction Methods: • search for related material (documents) • organization of knowledge (hierarchical and graph-based frameworks) • KM Tools: • search (Google, Google Scholar, CiteSeer, IEEE Digital Library, PolyCat, Spotlight) • knowledge collection utility (Google Notebook, Zotero, Scrapbook, DevonThink) • knowledge organization tools (outliner, concept mapping, ontology editor) • Usability Measures: • ratio of useful/irrelevant documents (precision, recall) • effort to perform an activity or task (time, basic actions, cost, utilization of resources) • user satisfaction • “pleasantness”
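The precision/recall usability measure named in the slide above (the ratio of useful to irrelevant documents) reduces to simple set arithmetic; the document identifiers below are made up for illustration:

```python
# Sketch: precision and recall over a retrieved result set.
# Precision = fraction of retrieved documents that are relevant;
# recall = fraction of relevant documents that were retrieved.

def precision_recall(retrieved, relevant):
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

retrieved = ["d1", "d2", "d3", "d4"]   # documents the search returned
relevant = ["d2", "d4", "d5"]          # documents actually useful for the task

p, r = precision_recall(retrieved, relevant)
print(p, r)  # 0.5 (2 of 4 retrieved are useful), 0.666… (2 of 3 useful ones found)
```

High precision with low recall means the user sees little noise but misses material; the reverse means everything useful is there, buried in irrelevant hits. Both directions cost the effort and satisfaction the slide lists as further measures.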

  23. Key points • An evaluation paradigm is an approach that is influenced by particular theories and philosophies. • Five categories of techniques were identified: observing users, asking users, asking experts, user testing, modeling users. • An evaluation framework like DECIDE offers some guidance for the practical aspects of usability evaluations

  24. Activity: DECIDE Framework • apply the DECIDE framework to your team project in this class • specifically to the usability of knowledge • distinguish between usability of the tool or system, and usability of the knowledge the system deals with • Determine the goals the evaluation addresses. • Explore the specific questions to be answered. • Choose the evaluation paradigm and techniques to answer the questions. • Identify the practical issues. • Decide how to deal with the ethical issues. • Evaluate, interpret and present the data.

  25. Using Knowledge • What do we want when we look for information or knowledge? • answers, not documents! • current retrieval systems identify documents that may or may not contain the answer • irrelevant documents • partial answers • multiple answers • inconsistent, wrong context, …

  26. Current Usage of Retrieval Systems • tools to identify potentially relevant documents • formulation of questions as unnatural queries • either simplistic sets of keywords, or complex expressions • ranking of retrieved documents according to obscure criteria • re-formulation of queries to influence ranking • mostly batch processing • submit query • wait • view result
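The query style described above (simplistic sets of keywords, documents ranked by opaque criteria) can be caricatured in a few lines. The toy corpus and the overlap-count scoring are illustrative only, not a real retrieval system:

```python
# Sketch: keyword-set retrieval in the batch style described above.
# Documents are scored by how many query keywords they contain,
# then returned as a ranked list -- submit query, wait, view result.

docs = {
    "doc1": "knowledge management tools for organizations",
    "doc2": "usability evaluation of software tools",
    "doc3": "knowledge retrieval and information management",
}

def rank(query, docs):
    terms = set(query.lower().split())
    scores = {
        name: len(terms & set(text.lower().split()))
        for name, text in docs.items()
    }
    # highest keyword overlap first; the tie-breaking order is arbitrary,
    # which is exactly the kind of obscure ranking criterion users face
    return sorted(scores, key=scores.get, reverse=True)

print(rank("knowledge management", docs))
```

Note what the user gets back: a ranked list of documents that *mention* the keywords, not an answer to a question, which is the gap the next slide addresses.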

  27. Better Usage of Retrieval Systems • provide answers to questions • find the right information fast • analyze information, combine it into easily digestible formats • summarize longer documents, sets of related documents • relate it to decisions to be made

  28. Pre-Test

  29. Motivation

  30. Objectives

  31. Knowledge Usage in Context • terminology • dissemination, utilization, diffusion, technology transfer • knowledge dissemination and utilization • traditional model • conceptual and instrumental use • related terms • knowledge use as learning process • knowledge life span • usefulness and accessibility of knowledge • knowledge life cycle • activities dealing with knowledge throughout its useful life

  32. What is Knowledge Utilization? • no clear definition • often used interchangeably with • knowledge dissemination • knowledge transfer • knowledge usage • usually assume two aspects • distribution of knowledge, information, or products • incorporation of conceptual or instrumental use of knowledge into relevant activities

  33. Knowledge Dissemination and Utilization • due to a number of factors, existing knowledge is not used effectively • driven by the dissemination side (researchers), rather than the knowledge use side (practitioners) • finding practical applications of knowledge is often left to potential users • lack of coordinated knowledge utilization activities • ad hoc dissemination models, very few attempts at systematic approaches to utilization • cumbersome accessibility • finding and accessing knowledge has been the domain of specialists (librarians, consultants)

  34. Dissemination Types • spread • one-way diffusion or distribution of knowledge and information • choice • users actively seek and acquire knowledge from established or alternative sources • users learn about their options • exchange • interactions between people • multi-directional flow of knowledge and information • implementation • technical assistance, training, interpersonal activities [NCDDR 1996]

  35. Extension Model of Knowledge Use • rational, linear conception of the process of knowledge utilization • knowledge is packaged and moved from one place to another • based on the assumption that knowledge can be arranged into definable, useable units that can be transmitted easily • “getting the word out” • based on the hope that potential users will hear about it, and be willing and capable to utilize it • does not reflect the use of knowledge in many situations [NCDDR 1996]

  36. Complex Model of Knowledge Use • the process usually is not rational nor linear • complex • multiple sources, multiple media and paths of delivery • interdependencies between individual knowledge items • context may be critical • transactional • may involve transactions between source (expert) and user • dependent on the background of potential users • pre-existing knowledge, beliefs, experiences • the user is involved in the usage process • problem-solver • constructor of a personal knowledge base [NCDDR 1996]

  37. Knowledge Usage • conceptual use • changes in levels of knowledge, understanding, or attitude • instrumental use • changes in behavior and practice • strategic use • manipulation of knowledge to attain specific goals • power, profit, political gain [NCDDR 1996]

  38. Knowledge Usage Metaphors • “tabula rasa” • the learner’s mind is an empty slate upon which people “in the know” impress knowledge • learner as a sponge • soaking up knowledge, largely without filtering or processing • brain as a computer • processes information in a systematic fashion as it is received from outside sources

  39. Knowledge Use as Learning Process • role of knowledge • dynamic set of understandings influenced by its originators and its users • role of the learner • actively filters and shapes knowledge • integration into existing knowledge • constructs models of the environment • explanations to make sense of the world • pre-existing (mis-)understandings may have to be changed • they result in discrepancies of the mental model [NCDDR 1996]

  40. Dimensions of Knowledge Utilization • dissemination source • originator of information • initiator of dissemination • content • new knowledge • supporting information • dissemination medium • packaging and transmission of knowledge • user • person or organization to receive and apply the knowledge [NCDDR 1996]

  41. Knowledge Life Span [Kaplan 1997]

  42. Knowledge Life Cycle [Kaplan 1997]

  43. Utilization of Knowledge Assets [Konno 2000]

  44. Knowledge Usage Issues • selection of knowledge • composition of knowledge • merging of knowledge items • modification of knowledge • modification of system aspects • preservation of consistency • user motivation

  45. Knowledge Usage Template [Skyrme 1999]

  46. KM Benefits Tree [Skyrme 1999]

  47. Technology vs. Usage • [flattened diagram; recoverable labels: Task Context, Browse, Transfer into, Search, Organize, Gather, Query, Collaborate, Store, Output, Select, Create, Think, Evaluate, Plan; contrasting Today’s Technology Centered Systems with User and Usage Centered] [Griffin 2000]
