
Sensing & Mobility


Presentation Transcript


1. Sensing & Mobility
Ken Hinckley, Microsoft Research
Collaborators: Jeff Pierce, Mike Sinclair, Eric Horvitz
• Disclaimer: Opinions are Ken’s only and may not reflect the views of Microsoft, or anyone else for that matter.

2. Background Sensing: Can It Live Up to Its Promise?
• Sense more than just explicit commands
• Background info (presence, sounds, physical contact, …) can be sensed & exploited
• Simplify & enhance the user experience via better awareness of context

3. Sensing: What’s Unique about Mobile Devices?
• What are the unique input challenges for mobility?
  • Mobile devices are used under far more demanding conditions than the desktop (e.g., while driving)
  • Even “click on button” is difficult due to attentional demands
• Provide services/enhancements the user would not have the cognitive resources or time to perform
  • Does the device automatically sense what it needs? Push?
• Getting text into the device? Communication / read-only?
• Environs are changing, but what properties should be sensed?
• Do you need lots of sensor fusion to do anything useful? (a two-signal counterexample is sketched below)
• Is sensing only useful for small, task-specific devices?
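One question above is whether heavy sensor fusion is a prerequisite for anything useful. As a point of reference, here is a minimal sketch, in Python, of fusing just two signals, a binary touch sensor and a change in tilt, to infer that the user has picked the device up. The function name and threshold are illustrative assumptions, not from the talk:

```python
# Minimal two-signal fusion sketch: infer "user picked up the device"
# from a binary touch sensor plus a change in tilt angle. The names
# and the threshold are illustrative assumptions, not from the talk.

TILT_DELTA_THRESHOLD = 10.0  # degrees of tilt change treated as "moved"

def picked_up(touching: bool, tilt_now: float, tilt_resting: float) -> bool:
    """Require BOTH cues: touch alone could be a pocket or a bag,
    and tilt alone could be the table being bumped."""
    moved = abs(tilt_now - tilt_resting) > TILT_DELTA_THRESHOLD
    return touching and moved

# Example: device was lying flat (0 degrees), now touched and tilted.
print(picked_up(touching=True, tilt_now=35.0, tilt_resting=0.0))   # True
print(picked_up(touching=False, tilt_now=35.0, tilt_resting=0.0))  # False
```

Requiring both cues is the design choice here: either signal alone is easy to trigger accidentally, so even this tiny amount of fusion buys a meaningful reduction in false positives.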

  4. Example: Tilt Sensor Data from PalmPC (walking around)
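The tilt trace itself is not reproduced in this transcript, but the usual point of such data is that walking superimposes high-frequency bounce on a slowly varying orientation signal. Below is a minimal sketch of one common way to separate the two, a first-order low-pass (exponential smoothing) filter; the fake trace and the smoothing factor ALPHA are illustrative assumptions:

```python
# First-order low-pass filter: separate slow orientation changes from
# high-frequency walking noise in a tilt trace. The sample data and the
# smoothing factor ALPHA are illustrative, not from the talk.
import random

ALPHA = 0.1  # smaller = heavier smoothing, more lag

def low_pass(samples):
    """Exponentially smoothed estimate of the underlying tilt angle."""
    estimate = samples[0]
    out = []
    for s in samples:
        estimate = ALPHA * s + (1 - ALPHA) * estimate
        out.append(estimate)
    return out

# A fake trace: a steady 20-degree tilt with walking "bounce" on top.
random.seed(0)
raw = [20.0 + random.uniform(-8, 8) for _ in range(50)]
smooth = low_pass(raw)
print(f"raw range:    {min(raw):.1f} .. {max(raw):.1f}")
print(f"smooth range: {min(smooth):.1f} .. {max(smooth):.1f}")
```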

5. Dilemma: Intentional Control vs. Cognitive / Physical Burden
• Button-click → Touch → Hand-near-device
  • Less intentional control, more software inferential burden
• Use these “problems” to your advantage?
  • Decrease the cognitive burden of making decisions
  • {Touching, …} is something the user must do anyway?
• Benefit vs. cost of errors (intent / interpretation)
  • Is sensing necessarily annoying if “wrong”?
  • How to quantify the penalty of failure?
  • Perhaps the key is designing for graceful failure? (sketched below)
• Need to encourage a mental model of what’s sensed?
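The “graceful failure” idea in slide 5 is easiest to see in code: scale the system’s response to its confidence in the sensed inference, so that a wrong guess costs the user little. A minimal sketch; the thresholds, confidence values, and action strings are illustrative assumptions:

```python
# Graceful failure sketch: scale the system's response to its confidence
# in the sensed inference, so a wrong guess costs the user little.
# Thresholds and action names are illustrative assumptions.

ACT_THRESHOLD = 0.9      # act automatically only when very sure
SUGGEST_THRESHOLD = 0.6  # otherwise offer, rather than impose, the action

def respond(inference: str, confidence: float) -> str:
    if confidence >= ACT_THRESHOLD:
        return f"do: {inference}"        # and keep it cheap to undo
    if confidence >= SUGGEST_THRESHOLD:
        return f"suggest: {inference}?"  # user confirms with one tap
    return "do nothing"                  # stay out of the way

print(respond("rotate display to landscape", 0.95))  # do: ...
print(respond("rotate display to landscape", 0.70))  # suggest: ...?
print(respond("rotate display to landscape", 0.30))  # do nothing
```

The middle tier is what makes failure graceful: a bad guess degrades into an ignorable suggestion rather than an action the user must undo.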

6. Background Sensing: Some Open Issues
• Few interaction techniques, almost no studies
  • What do users expect? Do they like sensing? Do they care?
• How to overcome false positives / negatives? (one standard defense, hysteresis, is sketched below)
• An evaluative / scientific approach to sensing UIs?
• Automatic action vs. user control & overrides
  • Limits? Where is explicit user input necessary?
  • “Special cases” & the complexity of mobile environments may confuse sensors / override signals with noise
• What are some issues / tradeoffs?
  • Quick access vs. inadvertent activation of features
  • Sensor & display quality vs. power consumption
  • Cost, weight, features vs. UI complexity, …
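For the false positive / negative problem in slide 6, one standard defense is hysteresis: the feature turns on only when the sensed value crosses a high threshold, and turns off only when it drops below a much lower one, so noise near a single threshold cannot make it flicker. A minimal sketch with illustrative thresholds:

```python
# Hysteresis sketch against sensor noise: the feature turns ON only above
# a high threshold and OFF only below a low one, so jitter near a single
# threshold cannot toggle it repeatedly. Thresholds are illustrative.

ON_THRESHOLD = 0.8
OFF_THRESHOLD = 0.3

def run(readings):
    active = False
    states = []
    for r in readings:
        if not active and r > ON_THRESHOLD:
            active = True
        elif active and r < OFF_THRESHOLD:
            active = False
        states.append(active)
    return states

# Noisy readings hovering around 0.5 never activate the feature;
# a clear push past 0.8 does, and it stays on through the jitter.
print(run([0.5, 0.6, 0.4, 0.55]))      # [False, False, False, False]
print(run([0.5, 0.9, 0.6, 0.5, 0.2]))  # [False, True, True, True, False]
```

An explicit user override would simply force `active` on or off and suppress automatic transitions for a while, which is one concrete form the “user control & overrides” bullet could take.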
