
Graphical Gesture Tutorials for Mobile Touch-Screen Devices

Peter Allegretti, Mike Tanski, and Jenny Yuan

Abstract

The small size of smartphone screens creates many challenges for application developers. One of the most significant is communicating to users the gestures that are available on a given screen of an application. While there are a few studies on gesture hinting systems, there is very little academic research on graphical gesture tutorials for smartphone applications. Considering how many very popular apps use graphical gesture tutorials, this is a serious gap in the literature. This poster proposes research that would narrow the gap by using a commercially developed iPhone application as the basis for a study of graphical gesture tutorials for mobile touch-screen devices. More specifically, the poster explores how users respond to navigational gesture tutorials in a relatively simple iPhone application. The application is particularly interesting because it has no navigational menu: to move from the home screen to the two adjoining screens, users must swipe either right or left. The lack of a navigational menu makes the design of the graphical gesture tutorial especially important. The goal of the study is to determine the most effective, efficient, and aesthetically pleasing way to inform a user of how to navigate to and from the different screens of the application.

Literature Review

Background Screen Panning

All mobile touch-screen devices have adopted background screen panning as a standard user-interface feature (e.g., users swipe the background screen up to bring content from the bottom of the screen to the top). Johnson's 1993 study on background screen panning set the foundation for widespread adoption of this interface [1]. Since the rise of mobile touch-screen devices, background screen panning has quickly become an ingrained expectation of users. Mauney et al. note that 70% of study participants who own a device that uses swiping gestures intuitively swipe up to scroll down; by contrast, only 50% of participants who own devices that use arrow keys or scroll bars swipe down to scroll down [2]. Background screen panning is so ingrained that the gesture has transferred to desktop scrolling as well: Chen and Proctor found that users prefer the Mac OS X Mountain Lion approach, which uses the same swipe gestures as smartphones (swipe down to move up, swipe up to move down, swipe left to move right, and so on) [3].
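As a concrete illustration of this direct-manipulation mapping, here is a minimal UIKit sketch of background screen panning, in which the content layer tracks the finger so that a swipe up brings material from the bottom of the page into view. The class and view names are ours, and layout and bounds clamping are omitted; this is illustrative, not the implementation of any application discussed on the poster.

    import UIKit

    // Minimal sketch of background screen panning: the content layer follows
    // the finger, so swiping up reveals content from the bottom of the page.
    final class PanningViewController: UIViewController {
        // Hypothetical content view, assumed taller than the screen.
        private let contentView = UIView()

        override func viewDidLoad() {
            super.viewDidLoad()
            view.addSubview(contentView)
            let pan = UIPanGestureRecognizer(target: self,
                                             action: #selector(handlePan(_:)))
            view.addGestureRecognizer(pan)
        }

        @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
            // translation.y is negative while the finger moves up the screen.
            let translation = gesture.translation(in: view)
            // Content follows the finger: a swipe up shifts the content up,
            // bringing material from the bottom into view.
            contentView.center.y += translation.y
            gesture.setTranslation(.zero, in: view)
        }
    }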
Using Gestures to Access Off-Screen Objects

One of the most commonly referenced methods for notifying users of off-screen objects is the Halo method, which surrounds key off-screen objects with rings that are partially visible on the screen [4]. Burigat et al. compared the Halo approach to one that used different-sized arrows pointing in the direction of the off-screen objects, and found that the two methods did not differ significantly in users' ability to access off-screen objects effectively and efficiently on a mobile device for simple tasks [5]. Only as a task increases in complexity and requires spatial reasoning do the differences between Halos and arrows become noticeable [5]. For more complex tasks (tasks that include more screens and/or objects to navigate), arrows outperform Halos [5]. Gustafson et al. have improved upon the work of Burigat et al. by creating Wedge, an approach that identifies the precise location of a piece or cluster of off-screen data without cluttering the screen [6, 7].
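The arrow technique compared by Burigat et al. can be pictured geometrically: an indicator is drawn near the screen edge closest to the off-screen object and rotated to point at it. The sketch below is our own simplified reconstruction of that idea, clamping the indicator to an inset rectangle rather than ray-casting to the exact edge; it is not code from [5], [6], or [7].

    import Foundation
    import CoreGraphics

    // Where and how to draw an edge indicator for one off-screen object.
    struct OffscreenArrow {
        let position: CGPoint  // anchor point near the screen edge
        let angle: CGFloat     // rotation in radians; 0 points right
    }

    func arrow(for target: CGPoint,
               in screen: CGRect,
               inset: CGFloat = 24) -> OffscreenArrow? {
        // No indicator is needed when the object is already visible.
        guard !screen.contains(target) else { return nil }
        // Clamp the target to an inset rectangle so the arrow stays on screen.
        let bounds = screen.insetBy(dx: inset, dy: inset)
        let position = CGPoint(x: min(max(target.x, bounds.minX), bounds.maxX),
                               y: min(max(target.y, bounds.minY), bounds.maxY))
        // Rotate the arrow to point from its anchor toward the true location.
        let angle = CGFloat(atan2(Double(target.y - position.y),
                                  Double(target.x - position.x)))
        return OffscreenArrow(position: position, angle: angle)
    }

Halo would replace the arrow with a ring whose arc intrudes onto the screen, while Wedge replaces it with a truncated triangle that avoids the clutter rings can create when several objects lie in the same corner [6].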
Research Questions

• What is the most effective, efficient, and aesthetically pleasing way to inform a user of how to navigate to and from the different screens of a mobile application?
• How does a user who intuitively understands background screen panning respond to a navigational tutorial that includes an arrow pointing off-screen to the left?
• Does the user assume the arrow is telling them to swipe to the left, or do they understand it as a communication of an object off-screen to the left?

Methodology

Design

We plan to compare six types of interfaces in a between-subjects user experiment. The interfaces fall into three groups: directional, non-directional, and mixed. The directional interfaces use arrows that point to off-screen content. The non-directional interfaces use textual labels, graphical icons, and instructions. The mixed interfaces combine directional cues and text.

Interfaces

In Figure 1, off-screen content is identified, but users are not told what gestures are required to access it. In Figure 2, users are instructed to swipe in specific directions but are not told exactly what content is located on the adjoining screens. Figure 3 shows a multi-directional tutorial that combines arrows and text, and Figure 4 shows a tutorial built around a graphical swipe representation.

Tasks

We will provide twelve different information-seeking tasks for the participants to perform. These tasks are described using scenarios that involve the participants in the context of the information seeking, following the concept of "work tasks" proposed by Borlund [8]. Figure 5 illustrates an example task in which the user is asked to access off-screen content using swipe gestures. In this scenario, the user is presented with a graphical gesture tutorial that includes arrows and text and must decide whether the arrows refer to the swipe direction or to the location of the off-screen content.

Procedure

Study participants will follow these seven steps:

• Completion of an informed consent document, which includes detailed instructions about the experiment.
• Completion of a pre-study questionnaire.
• A brief tutorial on the general touch-screen interface the participant will be using.
• A brief practice session to familiarize the participant with the interface.
• Completion of a pre-task questionnaire.
• The task itself.
• Completion of a post-task questionnaire.

Participants will have up to 5 minutes to complete each task, and each task ends with the brief post-task questionnaire. The same procedure is followed for the remaining tasks, after which an exit interview is given. The collected data will be analyzed and used to derive best-practice guidelines for creating gesture tutorials for mobile touch-screen devices. Sketches of one possible session-assignment scheme and of a tutorial overlay in the style of Figures 3 and 4 follow.
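The poster does not spell out the six conditions or the assignment scheme, so the sketch below is only one plausible reading: six placeholder interface conditions, round-robin between-subjects assignment, and a simple rotation of task order to offset learning effects. All names here are hypothetical.

    import Foundation

    // Placeholder labels for the six tutorial interfaces; the poster groups
    // them as directional, non-directional, and mixed but does not list them.
    enum TutorialInterface: CaseIterable {
        case directionalArrows      // arrows pointing to off-screen content
        case textualLabels          // non-directional: text labels
        case graphicalIcons         // non-directional: icons
        case writtenInstructions    // non-directional: instructions
        case arrowsPlusText         // mixed: directional cues and text
        case swipeRepresentation    // graphical swipe hint (cf. Figure 4)
    }

    struct Session {
        let interface: TutorialInterface
        let taskOrder: [Int]                 // indices into the 12 work tasks
        let timeLimit: TimeInterval = 5 * 60 // up to 5 minutes per task
    }

    func makeSession(participant: Int, taskCount: Int = 12) -> Session {
        // Between-subjects: each participant sees exactly one interface.
        let interfaces = TutorialInterface.allCases
        let interface = interfaces[participant % interfaces.count]
        // Rotate the task order per participant to offset learning effects.
        let order = (0..<taskCount).map { ($0 + participant) % taskCount }
        return Session(interface: interface, taskOrder: order)
    }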

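Figures 3 and 4 (captions below) show the arrows-plus-text and swipe-representation tutorial styles. The following SwiftUI sketch of an arrows-plus-text overlay is illustrative only: it is not the Lumos app's code, and the screen names "History" and "Settings" are hypothetical stand-ins for the two adjoining screens.

    import SwiftUI

    // Illustrative arrows-plus-text tutorial overlay in the style of Figure 3,
    // dimming the home screen and hinting at the two adjoining screens.
    struct GestureTutorialOverlay: View {
        @Binding var isVisible: Bool

        var body: some View {
            if isVisible {
                ZStack {
                    // Dim the home screen so the hints stand out.
                    Color.black.opacity(0.6).ignoresSafeArea()
                    HStack {
                        // The ambiguity the study probes: does a left-pointing
                        // arrow mean "swipe left" or "content is to the left"?
                        Label("History", systemImage: "arrow.left")
                        Spacer()
                        Label("Settings", systemImage: "arrow.right")
                    }
                    .foregroundColor(.white)
                    .padding(24)
                }
                .onTapGesture { isVisible = false } // dismiss once read
            }
        }
    }

In the experiment, the overlay content would vary by condition: arrows alone for the directional interfaces, labels or icons alone for the non-directional ones, and combined cues for the mixed ones.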
Figures

Figure 1 – Lumos App by Doctored Apps
Figure 2 – Groupon's Mobile App
Figure 3 – Multi-directional gesture interface tutorial using arrows and text
Figure 4 – Graphical gesture interface tutorial using swipe representation
Figure 5 – Example of an off-screen content fetching task using Lumos

References

[1] Johnson, J., & Keavney, M. (1995). A Collection of Papers from FirstPerson, Inc.
[2] Mauney, D., Howarth, J., Wirtanen, A., & Capra, M. (2010). Cultural Similarities and Differences in User-Defined Gestures for Touchscreen User Interfaces. In CHI '10 Extended Abstracts on Human Factors in Computing Systems (pp. 4015–4020). ACM.
[3] Chen, J., & Proctor, R. W. (2012). Up or Down: Directional Stimulus-Response Compatibility and Natural Scrolling. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 56, pp. 1381–1385).
[4] Baudisch, P. (2004). Halo: A Virtual Periphery for Small Screen Devices. In Proceedings of the International Conference on Advanced Visual Interfaces (AVI '04) (pp. 80–84).
[5] Burigat, S., Chittaro, L., & Gabrielli, S. (2006). Visualizing Locations of Off-Screen Objects on Mobile Devices: A Comparative Evaluation of Three Approaches. In Proceedings of MobileHCI '06 (p. 668). Helsinki, Finland: ACM.
[6] Gustafson, S., Baudisch, P., Gutwin, C., & Irani, P. (2008). Wedge: Clutter-Free Visualization of Off-Screen Locations. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 787–796).
[7] Gustafson, S. (2009). Visualizing Off-Screen Locations on Small Mobile Displays.
[8] Borlund, P. (2003). The IIR Evaluation Model: A Framework for Evaluation of Interactive Information Retrieval Systems. Information Research, 8(3).

Contact Information

Peter Allegretti – pallegretti@albany.edu
Mike Tanski – mtanski@albany.edu
Jenny Yuan – xyuan@albany.edu