
EDSS Program Review: June 2002


Presentation Transcript


  1. EDSS Program Review: June 2002 David Jones: APL-UW Jim Ballas: NRL Applied Physics Laboratory/ Naval Research Laboratory

  2. Outline • David • Introduction • Project Overview • HCI Experience • Progress Report Applied Physics Laboratory/ Naval Research Laboratory

  3. Outline… • Jim • UCD Process • Task Analysis • Evaluations • Future Plans Applied Physics Laboratory/ Naval Research Laboratory

  4. APL-UW Team Members • Bob Miyamoto- PI • David Jones • Troy Tanner & Bill Kooiman Applied Physics Laboratory/ Naval Research Laboratory

  5. APL-UW Background • Miyamoto’s Group • Env Effects on Sensors • TDA development • Training Tools • Me Applied Physics Laboratory/ Naval Research Laboratory

  6. Project Overview • General Philosophy- the design model fits the user’s mental model: UCD • Perform task analyses that feed into the interface design & its evaluation • Support EDSS developers with UCD and HCI standards and guidelines • Provide iterative feedback Applied Physics Laboratory/ Naval Research Laboratory

  7. Example of UCD Application: DMARS • Study the user’s information needs • Study how the user performs given tasks • Create an intuitive process • Involve users in design Applied Physics Laboratory/ Naval Research Laboratory

  8. High Seas Workflow System • Helps produce the High Seas Warning • Heard from the supervisors • Then heard from actual users • Different stories • Created a system for the users • Flexible design Applied Physics Laboratory/ Naval Research Laboratory

  9. HSW cont… • Funded by DARA & SPAWAR • Written in Java • Uses Polexis’ XIS for map functions • Running operationally at SD METOC Center Applied Physics Laboratory/ Naval Research Laboratory

  10. FNC Project: EVIS • Studying human-system component of METOC support • Worked closely with users and customers • Conducted experiments • Performed Cognitive Task Analysis • Gone to sea for evaluation Applied Physics Laboratory/ Naval Research Laboratory

  11. Progress Report • Initial Task Analysis -Jim • Gaining Domain Knowledge • Training Observations- David • Organizing a UCD Workshop Applied Physics Laboratory/ Naval Research Laboratory

  12. Gaining Domain Knowledge • EDSS User’s Guide • Draft Mission Needs Statement for Distributed Collaborative Planning Systems for Expeditionary Forces • COMPHIBGRU THREE 041605Z OCT99 • 4.X Tiger Team User Input Spreadsheet • Pubs: NWP 3-02.1; ATP 3 ch 6, JP 3-02 Applied Physics Laboratory/ Naval Research Laboratory

  13. Training on the USS Tarawa • Attended training in San Diego: Apr • Enthusiastic users • Hands-on training well received • Great audience for usability evaluation Applied Physics Laboratory/ Naval Research Laboratory

  14. Example of User Reactions • Staff personnel were excited about creating overlays for planning, but… • Menu headings caused some confusion • “Are Assault Plans part of AOA Mgmt?” • “When do I use the Env DB?” Applied Physics Laboratory/ Naval Research Laboratory

  15. User Reactions • Navigation among the different windows was difficult at times • Some windows require expertise that all users might not have or might have forgotten Applied Physics Laboratory/ Naval Research Laboratory

  16. Quick Thoughts after Training • Most users want it to look like Windows • DII UIS provides HCI guidance • Some ideas • Back arrows & an Undo command • Web-based and searchable user’s guide • Tooltip help on mouse-over of a menu title (see the sketch below) • Workflow wizards or web-based training Applied Physics Laboratory/ Naval Research Laboratory
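One of the ideas above, tooltip help on mouse-over of a menu title, is easy to prototype. The following is a minimal sketch only, assuming a Java Swing interface (the slides do not say which toolkit EDSS uses); the menu name and tooltip text are illustrative, drawn from the confusion noted during training, not from the actual EDSS menus.

```java
import javax.swing.*;

// Minimal sketch, assuming a Swing UI (the actual EDSS toolkit is not
// specified here). A tooltip on the menu title appears on mouse-over and
// explains what the menu groups, addressing questions such as
// "Are Assault Plans part of AOA Mgmt?"
public class MenuTooltipSketch {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Menu tooltip sketch");

            JMenu aoaMenu = new JMenu("AOA Mgmt");
            // Shown when the mouse hovers over the "AOA Mgmt" menu title
            aoaMenu.setToolTipText(
                "Amphibious Objective Area management, including Assault Plans");
            aoaMenu.add(new JMenuItem("Assault Plans"));

            JMenuBar menuBar = new JMenuBar();
            menuBar.add(aoaMenu);
            frame.setJMenuBar(menuBar);

            frame.setSize(400, 200);
            frame.setDefaultCloseOperation(WindowConstants.EXIT_ON_CLOSE);
            frame.setVisible(true);
        });
    }
}
```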

  17. UCD Workshop • Scheduled for 30 July 2002 • At SAIC Tysons Corner office • Our ideas • UCD Processes • DII HCI Standards • HCI Design Principles with examples • HCI Evaluation • SAIC Ideas? Applied Physics Laboratory/ Naval Research Laboratory

  18. Jim Ballas • UCD & HCI • Task Analysis • Evaluations • Future Plans Applied Physics Laboratory/ Naval Research Laboratory

  19. NRL-WDC • Team Members • Jim Ballas • Ph.D. in Applied Experimental Psychology • Derek Brock • M.S. in Computer Science, HCI emphasis • Beth Kramer • M.S. in Human Factors Psychology • Janet Stroup • B.A., some graduate CS coursework Applied Physics Laboratory/ Naval Research Laboratory

  20. NRL-WDC: Interface Design and Evaluation Section • 4 Ph.D.s on a staff of 10 • Expertise in HCI, Human Factors, Cognitive Science, Computer Science, Auditory Perception • Projects include AEGIS (with LMC), DDX (with Raytheon), NATO S&T, KSA EVIS • Management of 6.1 to 6.3 projects Applied Physics Laboratory/ Naval Research Laboratory

  21. NRL-WDC: Interface Design and Evaluation Section • HCI research cited in major reference documents • Handbook of Human-Computer Interaction • ACM CHI Conference • 2001 paper: “Demystifying Direct Manipulation” • Wrote and revised the Operator Workstation Evaluation section for the IUSW-21 At-Sea Test this September Applied Physics Laboratory/ Naval Research Laboratory

  22. User Centered Design • Following approach outlined in NATO COADE document • Additional principles: HCI as an instance of language use Applied Physics Laboratory/ Naval Research Laboratory

  23. User Centered Design: COADE Applied Physics Laboratory/ Naval Research Laboratory

  24. User Centered Design: COADE Applied Physics Laboratory/ Naval Research Laboratory

  25. Viewing HCI as an Instance of Language Use • The design and implementation of an effective software application and its user interface is ultimately a communication problem that always involves both the designer’s meaning and the user’s understanding • The principles at work in people’s use of language form a comprehensive framework for the design of human-computer interaction Applied Physics Laboratory/ Naval Research Laboratory

  26. Principles of Language Use • Any form of communication between people is an instance of language use; language is used to do things together • Language use requires people to coordinate their actions and their attention (cognition); it always involves “speaker’s” meaning and “addressee’s” understanding. • Meaning and understanding require common ground Applied Physics Laboratory/ Naval Research Laboratory

  27. Common Ground • Common ground is knowledge that people establish they can use with each other on the basis of shared experience • When common ground is missing, meaning and understanding break down • Building common ground is always a serial process - even though the resulting shared knowledge may contain gaps Applied Physics Laboratory/ Naval Research Laboratory

  28. Layers in Language Use and in HCI • Language use frequently involves more than one conceptual layer of activity; telling a story, for instance, involves at least two layers: • The story teller and the listener participate as themselves in the first layer • In the second layer, the events of the story take place • Similarly, HCI has two principal layers of activity: • The designer and user participate as themselves in the first layer • In the second layer, the user interacts with the computer as if it (and not the designer) were the user’s counterpart • Each layer in an instance of language use makes different demands of the user’s language use skills Applied Physics Laboratory/ Naval Research Laboratory

  29. Language Use Issues in HCI • In the first layer of HCI, designers, through the software’s presentation, must help users to compensate for gaps that direct access (through menus, etc.) imposes on the process of building coherent common ground • In the second layer, wherever possible, interfaces should be designed to allow users to establish and use common ground with the interface itself as a regular part of their interaction with the computer Applied Physics Laboratory/ Naval Research Laboratory

  30. Guidelines, Standards, and Relevant Literature • DII User Interface Standards • MIL-STD-2525 • Research on distributed planning: Klein & Miller • Work directed by NRL • Cited in MCDP-5 • General human factors and HCI literature Applied Physics Laboratory/ Naval Research Laboratory

  31. Work to Date: Initial Task Analysis Partially Complete • Initial Task List:EDSS planning • Make Basic Decisions • Create Operational Area • Determine Landing Craft • Complete Default Craft Parameters Table • Make Navigation Decisions • Design Sea Echelon Areas • Free Hand • 4W Grid • Select HLZ • Select Beach Center and Boat Lane • Design Routes • Design Display • Determine Ship-to-Shore Movement Applied Physics Laboratory/ Naval Research Laboratory

  32. Task Analysis: other tasks • Administrative/file operation tasks • Log On And Initializing • Log Off • Installing New EDSS Software • Exporting Plans • Importing Plans • GCCS-M Tasks • Saving Slides • Deleting Slides • Exporting Slides • Installing Maps And Charts • Retrieving Maps Or Charts • Removing Maps Or Charts • Uninstalling Maps, Charts, And Imagery • Line Of Sight (LOS) Profile Applied Physics Laboratory/ Naval Research Laboratory

  33. Future Work • Complete task analysis, including cognitive analysis using the Critical Decision Method • Workshop: to include illustrations of design issues, e.g., the effect of lighting/filtering on color images Applied Physics Laboratory/ Naval Research Laboratory

  34. Future Work: Evaluation • Approach: evaluate user performance and compare to desired performance • Examples: • DMARS • EVIS • Software Development Tools and Processes Applied Physics Laboratory/ Naval Research Laboratory

  35. DMARS Evaluation Summary • Observe and compare use of three media • Paper (NAVOCEANO Mine Warfare Pilot publication) • WWW (based on MWP, so-called RP-WEB) • UC-CD (User Centered Digital METOC Acoustic Reference Manual - DMARS) • Five METOC tasks: prepare brief and answer 4 questions • 12 METOC officers and enlisted personnel • NAS Patuxent River • NAS North Island • Each person tested on five tasks using three media Applied Physics Laboratory/ Naval Research Laboratory

  36. DMARS Evaluation: Process Measures • Task timing logged with the Activity Catalog Tool, a NASA-sponsored tool (see the tally sketch below) • Coding scheme distinguished the following tasks • Acquire information • Browse: search for a topic in the METOC document by navigating from one section to another (Search for and Move To) • Interpret: read information from a specific section • Assemble briefing document • Compose/edit: generate and/or modify the document using a word processor or presentation software • Copy/paste: copy material from the METOC document into the briefing document Applied Physics Laboratory/ Naval Research Laboratory
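The coding scheme above lends itself to simple per-category time totals. Below is a minimal sketch with hypothetical data; the Activity Catalog Tool's actual log format is not described in these slides. It assumes a flat list of (code, duration) observations and sums the seconds spent in each coded activity.

```java
import java.util.*;

// Minimal sketch with hypothetical data (not the Activity Catalog Tool's
// real output format). Tallies total seconds per coded activity:
// browse, interpret, compose/edit, copy/paste.
public class CodedTimeTally {
    record Observation(String code, double seconds) {}

    public static void main(String[] args) {
        List<Observation> log = List.of(
            new Observation("browse", 95.0),
            new Observation("interpret", 40.5),
            new Observation("browse", 62.0),
            new Observation("compose/edit", 180.0),
            new Observation("copy/paste", 25.0));

        // Sum durations for each activity code, preserving insertion order
        Map<String, Double> totals = new LinkedHashMap<>();
        for (Observation o : log) {
            totals.merge(o.code(), o.seconds(), Double::sum);
        }
        totals.forEach((code, secs) ->
            System.out.printf("%-14s %7.1f s%n", code, secs));
    }
}
```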

  37. DMARS Evaluation: Outcome Measures • Accuracy • On Problem 1, the subject presented the brief and the experimenter graded eight items • Required analysis, not just a picture • On Problems 2-5, the subject supplied answers • Number of images used in briefing • Preference Applied Physics Laboratory/ Naval Research Laboratory

  38. DMARS Evaluation: Time to Find Information to Prepare Briefing • Significant effect of document type • F(2, 16) = 5.62, p < .01 (see the note below) • RP-WEB > [UC-CD, PAPER] • Discussion • Effect on key problem • Only on browse time • No briefing preparation differences (when the time to manually prepare slides is added to the PAPER condition) • No interpretation time differences • Magnitude • RP-WEB 160 s longer than UC-CD on a task that overall takes 1260 s (a 13% time increase) Applied Physics Laboratory/ Naval Research Laboratory
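For readers unfamiliar with the notation, the reported statistic is consistent with a one-way repeated-measures ANOVA across the three media; the note below is an interpretation of the reported degrees of freedom, not an analysis detail stated in the slides. The same reading applies to the accuracy result on the next slide.

```latex
% Sketch of the assumed test (a one-way repeated-measures ANOVA over the
% three media; the analysis model is not stated explicitly in the slides).
\[
  F(\mathrm{df}_1,\ \mathrm{df}_2)
    = \frac{MS_{\text{media}}}{MS_{\text{media}\times\text{subject}}},
  \qquad
  \mathrm{df}_1 = k - 1,
  \qquad
  \mathrm{df}_2 = (k - 1)(n - 1),
\]
% where k = 3 media (PAPER, RP-WEB, UC-CD) and n is the number of subjects
% contributing complete data; the reported F(2, 16) = 5.62 gives p < .01.
```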

  39. DMARS Evaluation: Accuracy • Significant effect of document type • F(2, 16) = 3.48, p = .055 • UC-CD > [PAPER, RP-WEB] @ p = .087 • Discussion • UC-CD errors on Problems 4 and 5 due to omission of location selection • UC-CD errors on winds in Problem 1 due to user inexperience with the new form of wind vector; only a short training period was used Applied Physics Laboratory/ Naval Research Laboratory

  40. Example of At Sea Observations: USS CARL VINSON during Battlegroup training (COMPTUEX) • [Office layout diagram: workstations (IT21, NITES NT, NITES servers, SMOOS, SMQ-11 satellite, SPA-25 radar, CCTV), files, printer, copier, coffee, a video camera, and personnel positions; legend: F = Forecaster, T = Technician, O = Project Observer] Applied Physics Laboratory/ Naval Research Laboratory

  41. At Sea Observation Methodology • Office Environment • Observed forecasters’ workflow (on a non-interfering basis) • Two watches a day (12 on, 12 off) • Each session is about 12 hours long • Two observers for each watch; two observations per watch • Equipment • Three video cameras • Note taking • Palm Pilot for recording the timing of task performance • Questionnaires • Interviews Applied Physics Laboratory/ Naval Research Laboratory

  42. Future Work Summary • Workshop • Task Analysis • Design recommendations • Evaluation Applied Physics Laboratory/ Naval Research Laboratory
