Simplifying Wheelchair Mounted Robotic Arm Control with a Visual Interface Katherine M. Tsui and Holly A. Yanco University of Massachusetts Lowell http://www.cs.uml.edu/robots
Outline • Collaborators • Research Question • Hardware • Experiment • Current/Future work
Collaborators • University of Central Florida: Aman Behal • Crotched Mountain Rehabilitation Center: David Kontak • Exact Dynamics: Gert Willem Römer • NSF IIS-0534364
Research Question • What is the most effective user interface to manipulate a robot arm? • Our target audience is power wheelchair users, specifically: • Physically disabled, cognitively aware people. • Cognitively impaired people who do not have fine motor control.
Hardware • Manus ARM by Exact Dynamics • 6 DoF • Joint encoders, slip couplings • Cameras • Manual and computer control modes • Both modes are capable of individual joint movement and Cartesian movement of the wrist.
Interface Design • Interface is compatible with single-switch scanning. • Left: • Original image is quartered. • Quadrant containing the desired object is selected. • Middle: • Selection is repeated a second time. • Right: • Desired object is in 1/16th close-up view.
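A minimal sketch of this hierarchical quadrant selection in Python. The function names (`crop_quadrant`, `close_up`) and the NumPy image representation are illustrative assumptions, not the actual interface code:

```python
import numpy as np

def crop_quadrant(image, row, col):
    """Return one quadrant (row, col each 0 or 1) of an image array."""
    h, w = image.shape[0] // 2, image.shape[1] // 2
    return image[row * h:(row + 1) * h, col * w:(col + 1) * w]

def close_up(image, major, minor):
    """Two successive quadrant selections yield a 1/16th close-up."""
    quarter = crop_quadrant(image, *major)  # first selection: 1/4 view
    return crop_quadrant(quarter, *minor)   # second selection: 1/16 view

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in camera frame
view = close_up(frame, (0, 1), (1, 0))           # 120 x 160 pixel close-up
```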
User Testing: Hypotheses • H1: Users will prefer a visual interface to a menu-based system. • H2: With greater levels of autonomy, less user input is necessary for control. • H3: Moving to the target should be faster in computer control than in manual control.
User Testing: Experiment • Participants • 12 able-bodied participants (10 male, 2 female) • Ages 18 to 52 • 67% technologically capable • Computer usage per week (including job-related): • 67% 20+ hours; 25% 10 to 20 hours; 8% 3 to 10 hours • 1/3 had prior robot experience: • 1 in industry; 2 in a university course; 1 with "toy" robots
User Testing: Experiment Methodology • Two tested conditions: manual and computer control. • A single switch was the input device for both conditions. • Each user performed 6 runs (3 manual, 3 computer). • The starting control was randomized, then alternated across runs. • 6 targets were randomly chosen.
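The slides do not show how the run order was generated; a short sketch of one way to produce a "randomized, then alternated" schedule (names illustrative):

```python
import random

def run_schedule(n_runs=6):
    """Randomly pick the starting control, then alternate across runs."""
    first = random.choice(["manual", "computer"])
    other = "computer" if first == "manual" else "manual"
    return [first if i % 2 == 0 else other for i in range(n_runs)]

targets = random.sample(range(1, 7), k=6)  # the 6 targets in random order
```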
User Testing: Experiment Methodology • The computer control implementation used during testing provided neither fine control nor depth information. • In manual control, users were instructed to move the opened gripper "sufficiently close" to the target.
User Testing: Experiment Methodology • Manual control procedure, using the single switch and a single-switch menu: • Unfold the ARM. • Using Cartesian movement, maneuver the opened gripper "sufficiently close" to the target.
User Testing: Experiment Methodology • Computer control procedure: • Turn on the ARM. • Select the image using the single switch. • Select the major quadrant using the single switch. • Select the minor quadrant using the single switch. • Color calibrate using the single switch.
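Every selection above goes through single-switch scanning. A minimal sketch, assuming a hypothetical `switch_pressed()` callback for the physical switch; the interface's actual scanning code is not shown on the slides:

```python
import time

def scan_select(options, switch_pressed, dwell=1.0):
    """Highlight options in turn; return the one highlighted at switch press."""
    while True:
        for option in options:
            print(f"highlighting: {option}")  # stand-in for the UI highlight
            time.sleep(dwell)                 # dwell time before advancing
            if switch_pressed():
                return option

# e.g., picking the major quadrant with the single switch:
# quadrant = scan_select(["NW", "NE", "SW", "SE"], switch_pressed)
```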
User Testing: Results H1: Users will prefer a visual interface to a menu-based system. • 83% stated a preference for manual control in exit interviews. • Likert-scale ratings (1 to 5) of manual and computer control showed no significant difference in user preference. • H1 was not supported. • Why? The color calibration step.
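The slides do not name the statistical test behind "no significant difference"; for paired 1-to-5 ratings from the same participants, a Wilcoxon signed-rank test is a common choice. A minimal sketch under that assumption (function name illustrative):

```python
from scipy.stats import wilcoxon

def compare_likert(manual_ratings, computer_ratings):
    """Paired comparison of each participant's 1-5 ratings of the two controls."""
    stat, p = wilcoxon(manual_ratings, computer_ratings)
    return stat, p  # p > 0.05 would indicate no significant difference
```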
User Testing: Results H2: With greater levels of autonomy, less user input is necessary for control. • In manual control, we counted the clicks each user executed during a run and divided by the run time, yielding average clicks per second. • In computer control, the number of clicks is fixed. • H2 was confirmed.
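A sketch of the input-rate metric described above; the function and constant names are illustrative, not from the study's code:

```python
def clicks_per_second(n_clicks, run_seconds):
    """Average user input rate over a manual-control run."""
    return n_clicks / run_seconds

# Under computer control the click count is fixed per run: the procedure
# above uses four single-switch selections (image, major quadrant,
# minor quadrant, color calibration).
FIXED_COMPUTER_CLICKS = 4
```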
User Testing: Results H3: Moving to the target should be faster in computer control than in manual control. • Distance-to-time ratio: moving distance X takes time Y. • Under computer control, the ARM moved farther in less time. • H3 was confirmed.
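A sketch of the ratio, assuming straight-line distance from the gripper's start position to the target (names illustrative):

```python
import math

def distance_time_ratio(start_xyz, target_xyz, run_seconds):
    """Distance covered per unit time (e.g., cm per second)."""
    distance = math.dist(start_xyz, target_xyz)  # straight-line distance
    return distance / run_seconds
```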
Current/Future Work • Identify specific volunteers • User interface • User testing: • H1 • Baseline evaluation • Initial testing at Crotched Mountain • Integration with power wheelchair • Depth extraction • Occlusion