
Evaluation of Human-Robot Interaction in the NIST Reference Search and Rescue Test Arenas

  1. Evaluation of Human-Robot Interaction in the NIST Reference Search and Rescue Test Arenas
     Jean Scholtz, Brian Antonishek, Jeff Young
     Permis 2004

  2. Outline of Talk
     • NIST Reference Search and Rescue Test Arenas
     • Human-Robot Interaction (HRI)
     • Challenges
     • Case studies from USAR Competitions
     • Methods
     • Metrics
     • Guidelines
     • Recommendations

  3. NIST USAR Reference Test Arenas
     • Provide a repeatable way to evaluate a search and rescue system (robot + operator + human-robot interaction)
     • Score depends on (an illustrative sketch follows this slide):
       • Number of victims located
       • Difficulty of the arena in which victims are located
       • Accuracy of victim location
       • Accuracy of victim assessment
       • Penalties incurred in locating victims
     • Autonomy levels, HRI, platform mobility, and sensor packages are left to the participants' discretion
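A minimal sketch of how such a composite score might be computed, assuming hypothetical arena difficulty weights and a toy location-accuracy falloff; the talk does not give the official competition formula:

```python
from dataclasses import dataclass

@dataclass
class VictimFind:
    arena_weight: float        # hypothetical difficulty weight for the arena
    location_error_m: float    # distance between reported and true position
    assessment_correct: bool   # whether the victim's state was identified correctly

def run_score(finds: list[VictimFind], penalty_points: float) -> float:
    """Toy composite score: harder arenas and more accurate reports
    score higher; penalties subtract. All weights are assumptions."""
    score = 0.0
    for f in finds:
        location_term = max(0.0, 1.0 - f.location_error_m / 2.0)  # toy falloff over 2 m
        assessment_term = 1.0 if f.assessment_correct else 0.0
        score += f.arena_weight * (1.0 + location_term + assessment_term)
    return score - penalty_points
```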

  4. Examples of NIST USAR Arenas

  5. USAR Competitions
     • Success depends on:
       • Mobility of platforms
       • Skill of the operator
       • Affordances and ease of use of the user interface
       • Sensor packages
       • Communications
       • System robustness
     • Currently we do not evaluate the various components separately; instead, we use overall system performance to determine the winners of the competitions

  6. HRI Evaluation
     • Challenge:
       • To determine the contribution of the human-robot interaction design to overall performance
       • And, in the process, to develop both metrics for HRI and guidelines for its design
     • HRI is more than just the visual interface; it includes the design of the interaction dialogue between the robot(s) and the operator(s)

  7. Data Collection for HRI at USAR Competitions
     • We have collected data from six major competitions since 2002
       • Offers a wide range of HRI designs
       • Operators are robotics researchers, hence a best-case scenario
       • We are limited in our ability to interview operators and control conditions
     • Data collected include:
       • Video of the robot in the arena (ground truth)
       • Video of what the operator sees
       • Video of operator actions (in some cases)
       • Maps of arena coverage

  8. Data Analysis
     • Hypothesis: systems that are able to cover more of the arena should be more successful
     • Analyzed the percentage of time spent in:
       • Navigation
       • Victim identification
       • Logistics
       • Failures
     • Looked for correlations between coverage, where time was spent, and success in the competition (a sketch of this analysis follows this slide)
     • Finding: the more time spent navigating, the more victims found
     • Correlation with coverage is difficult to compute because time spent differed between arenas, arenas varied in difficulty, and coverage itself is hard to assess
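A minimal sketch of this kind of analysis, assuming hand-coded (category, seconds) event logs and made-up per-run numbers; `correlation` is the Pearson correlation from the Python standard library (3.10+):

```python
from collections import defaultdict
from statistics import correlation  # Python 3.10+

def time_fractions(events):
    """events: (category, seconds) pairs coded from the run videos,
    using the categories above (navigation, victim ID, logistics, failures)."""
    totals = defaultdict(float)
    for category, seconds in events:
        totals[category] += seconds
    grand_total = sum(totals.values())
    return {c: s / grand_total for c, s in totals.items()}

# Hypothetical per-run numbers: fraction of time navigating vs. victims found.
nav_fraction = [0.55, 0.40, 0.62, 0.35]
victims_found = [4, 2, 5, 1]
print(f"Pearson r = {correlation(nav_fraction, victims_found):.2f}")
```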

  9. Data Analysis, cont.
     • Human-robot awareness: the knowledge the human has of the location, status, and behavior of the robot
     • Indirect measures are necessary
     • Used critical incident analysis across five awareness categories (a tallying sketch follows this slide):
       • Global navigation
       • Local navigation
       • Obstacle evaluation
       • Vehicle state
       • Victim ID
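A minimal sketch of tallying critical incidents by awareness category, assuming the incidents have already been hand-coded from the run videos; the example log and run length are hypothetical:

```python
from collections import Counter

# The five awareness categories listed above.
CATEGORIES = ("global navigation", "local navigation", "obstacle evaluation",
              "vehicle state", "victim ID")

def incident_rates(incident_labels, run_minutes):
    """incident_labels: one category label per coded critical incident.
    Returns critical incidents per minute for each awareness category."""
    counts = Counter(label for label in incident_labels if label in CATEGORIES)
    return {c: counts[c] / run_minutes for c in CATEGORIES}

# Hypothetical 20-minute run with three coded incidents.
print(incident_rates(["local navigation", "vehicle state", "local navigation"], 20.0))
```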

  10. Data Analysis, cont.

  11. Data Analysis, cont.
      • What contributed to fewer critical incidents?
        • Local navigation: a frame of reference was provided (an overhead camera; a 2-degree-of-freedom camera used to see the robot's wheels in relation to the environment)
        • Obstacle encounters: front and rear cameras, and the ability to move the robot and camera at the same time
        • Vehicle state: a top-down view of the robot may have helped; audio also helped (but noise in the arena was excessive at times)
      • How did this correlate with success in the competition?
        • Obstacle encounters were the best predictor, but there was too little data to generalize

  12. Data Analysis, cont.
      • RoboCup 2004 allowed us to compare overhead-camera use with automatic mapping

  13. Guidelines for HRI Design
      • Information for effective situation awareness should include:
        • A frame of reference to determine the position of the robot relative to the surrounding environment
        • Indicators of vehicle state, such as pitch, roll, traction, sensor status, and camera positions relative to the robot body
        • A map to provide global navigation information
      • Minimize the number of windows provided to the operator.
      • Provide a fused view of sensor information.
      • Support multiple robot operators in a single display.
      • Provide help from the robot in determining which mode of autonomy is most useful.
      A configuration sketch illustrating these guidelines follows this slide.
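One way to make these guidelines concrete is a single-window display configuration; the structure and field names below are purely illustrative, not from the talk:

```python
# Hypothetical operator-display configuration honoring the guidelines above.
HRI_DISPLAY = {
    "windows": 1,                        # minimize the number of windows
    "main_view": "front_camera",
    "fused_overlays": [                  # fused view of sensor information
        "map",                           # global navigation information
        "pose_indicator",                # frame of reference for the robot
        "pitch_roll_traction",           # vehicle state indicators
        "camera_pose_relative_to_body",
    ],
    "autonomy_advisor": True,            # robot suggests a useful autonomy mode
    "max_concurrent_operators": 2,       # multiple operators, single display
}
```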

  14. Conclusions / Recommendations
      • Awareness assessment provides insights into the information operators need in order to avoid critical incidents
      • Indirect evaluation is problematic: it takes substantial resources, so it cannot produce feedback for robotics researchers in a timely fashion
      • Potential solution for more direct assessment: a "compulsory figures" evaluation for USAR competitions (a scoring sketch follows this slide)
        • Place robots in a number of situations and measure the time and accuracy operators need to assess and describe each situation
        • Eliminates execution of the situation (operator skill, platform mobility) as a factor
        • Could also provide a benchmark system for comparison
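A minimal sketch of how a compulsory-figures run might be summarized, assuming per-scenario timing and correctness are recorded; the scenario label and the aggregation are assumptions, not a published NIST metric:

```python
from dataclasses import dataclass

@dataclass
class FigureTrial:
    scenario: str             # e.g. "victim partially hidden by rubble" (hypothetical)
    seconds_to_assess: float  # time for the operator to assess and describe it
    assessment_correct: bool

def compulsory_figures_summary(trials: list[FigureTrial]) -> dict:
    """Aggregate accuracy and mean assessment time over a set of trials."""
    correct = [t for t in trials if t.assessment_correct]
    return {
        "accuracy": len(correct) / len(trials),
        "mean_time_s": sum(t.seconds_to_assess for t in correct) / max(len(correct), 1),
    }
```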
