Apples and Oranges: Lessons From a Usability Study of Two Library FAQ Web Sites

Presentation Transcript


  1. Apples and Oranges: Lessons From a Usability Study of Two Library FAQ Web Sites

    Susan [Gardner] Archambault Kenneth Simon
  2. Loyola Marymount University: a private Catholic university in Los Angeles, California, with 5,900+ undergraduates and 1,900+ graduate students. William H. Hannon Library Information Desk open 24/5.
  3. Research Question: What is the most effective way to provide access to our Library FAQs? A comparison of two products: How Do I? and LibAnswers. Which features do students prefer, and which features lead to better performance?
  4. How Do I?
  5. LibAnswers
  6. Auto-Suggest Feature
  7. Related Questions Feature
  8. Methodology: Conducted usability testing with 20 undergraduate students at LMU. The sample equally represented each class year (freshmen through seniors) and had a 60:40 ratio of females to males.
  9. Methodology: Used a combination of the Performance Test methodology and the Think-Aloud methodology.
  10. Methodology: Students were given 10 performance tasks to complete at a computer, twice: once using LibAnswers as the starting point, and once using How Do I? After each performance task, students were given a questionnaire measuring their satisfaction with the site.
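      The slides do not say how the starting site was assigned; a minimal Python sketch of one common approach, alternating the starting site across participants to balance order effects (the names here are illustrative, not from the study):

        # Illustrative counterbalancing sketch; the study's actual
        # assignment method is not described in the slides.
        SITES = ("LibAnswers", "How Do I?")

        def assign_order(participant_id: int) -> tuple:
            """Alternate which site a participant starts with."""
            first = SITES[participant_id % 2]
            return (first, SITES[1 - participant_id % 2])

        for pid in range(1, 5):
            print(pid, assign_order(pid))
        # 1 ('How Do I?', 'LibAnswers')
        # 2 ('LibAnswers', 'How Do I?')  ... and so on, alternating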
  11. Performance Task Questions
  12. Satisfaction Scale
  13. Methodology: Audio recorded and computer screen activity captured via “ScreenFlow” screencasting software.
  14. Additional Questions: How likely would you be to use each page again? What was your favorite aspect of each site? What was your least favorite aspect? Overall, do you prefer LibAnswers or How Do I?
  15. Performance Scoring: Speed. Start the clock when the person begins searching for the answer to a new question on the home page of the site they are testing; stop the clock when they copy the URL with the answer.
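      As an illustration only, speed could be computed from a timestamped event log; the event names and log format below are assumptions, not the study's actual instrumentation:

        # Minimal sketch: seconds from starting a search on the home page
        # to copying the answer URL. Event names are assumed.
        from datetime import datetime

        def task_speed(events):
            """Elapsed seconds between 'start_search' and 'copy_url'."""
            times = {name: datetime.fromisoformat(ts) for name, ts in events}
            return (times["copy_url"] - times["start_search"]).total_seconds()

        log = [("start_search", "2012-05-03T10:00:00"),
               ("copy_url", "2012-05-03T10:01:42")]
        print(task_speed(log))  # 102.0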
  16. Performance Scoring: Accuracy
  17. Performance Scoring: Efficiency. Count the number of times the person made a new attempt, or started down a new path, by returning to the home page *after* a previous attempt away from or on the home page failed.
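      Reduced to code, this counting rule might look like the Python sketch below, where a task's navigation is an ordered list of pages visited (the log format is an assumption for illustration):

        # Count attempts: the initial search from the home page, plus each
        # return to the home page after a previous path failed.
        def count_attempts(visits, home="home"):
            attempts = 0
            left_home = False
            for page in visits:
                if page == home and left_home:
                    attempts += 1      # came back home after a failed path
                    left_home = False
                elif page != home:
                    left_home = True
            return attempts + 1        # + the initial attempt

        print(count_attempts(["home", "faq/hours", "home", "faq/printing"]))  # 2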
  18. Sample Scoring Video: bit.ly/usabilityvideo
  19. Performance Results
  20. Performance Results
  21. LibAnswers Features Used
  22. Satisfaction
  23. Satisfaction
  24. Patterns: Overall, 9 of 20 performed worse with the site they said they preferred. 4 of 5 freshmen performed worse with the site they said they preferred; upperclassmen were more consistent. Females tended to perform better with their preferred site; males did not. 75% of the males preferred How Do I? over LibAnswers, while females were evenly divided.
  25. LibAnswers
      Likes: keyword search “like a search engine”; autosuggest in the search bar; popular topics list; friendly / pleasant to use; don’t have to read through categories.
      Dislikes: overwhelming / cluttered interface; long list of specific questions, but hard to find the info you want; less efficient than the “How Do I” page; once you do a search, you lose your original question; autosuggestions are ambiguous or too broad, and sometimes don’t function properly.
  26. How Do I?
      Likes: fast / efficient to use; everything is right there in front of you (“I don’t have to type, just click”); simple, clearly laid-out categories; organized and clean looking.
      Dislikes: less efficient than the LibAnswers page (have to read a lot); too restricted, needs a search box; have to guess a category to decide where to look; limited number of too-broad questions; boring / basic appearance.
  27. Sharing results with Springshare
      Retain the question asked in the search results screen.
      Add stopwords to search, so typing “How do I” doesn’t drop down a long list of irrelevant stuff, and “Where is” and “Where are” aren’t mutually exclusive.
      Remove “related LibGuides” content to reduce clutter.
      Control the list of “related questions” below an answer: they seem to be based only on the first topic assigned to a given question.
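      The stopword suggestion amounts to filtering function words out of a query before matching it against FAQ titles. A rough Python illustration (our own sketch, not Springshare's implementation):

        # Strip stopwords so "How do I print?" matches on "print" rather
        # than on "how", "do", and "i". Word list is illustrative only.
        STOPWORDS = {"how", "do", "i", "where", "is", "are",
                     "the", "a", "to", "from", "my"}

        def keywords(text):
            return {w.strip("?.,!") for w in text.lower().split()} - STOPWORDS

        def suggest(query, faq_titles):
            terms = keywords(query)
            return [t for t in faq_titles if terms & keywords(t)]

        faqs = ["How do I print from my laptop?",
                "Where are the group study rooms?"]
        print(suggest("How do I print?", faqs))          # only the printing FAQ
        print(suggest("Where is the study room?", faqs)) # the study-rooms FAQ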
  28. Take the best of… How Do I
  29. Take the best of… LibAnswers
  30. But wait… There is another.
  31. Take the best of… Get Help
  32. The best of all worlds
  33. Conclusions
      Ended up with a balance between the two extremes rather than choosing one or the other.
      Think-aloud method: we gave up control, so no preconceived ideas could influence the outcome.
      Sitting in silence watching the participants made them nervous; next time, maybe leave the room and run a self-guided test.
      Efficiency is difficult to measure: we moved away from counting clicks.
  34. Acknowledgements. Thank you: Shannon Billimore, Jennifer Masunaga, LMU Office of Assessment / Christine Chavez, Springshare.
  35. Bibliography
      Ericsson, K. A., & Simon, H. A. (1980). Verbal reports as data. Psychological Review, 87(3), 215-251.
      Norlin, E. (2002). Usability testing for library web sites: A hands-on guide. Chicago: American Library Association.
      Porter, J. (2003). Testing the three-click rule. Retrieved from http://www.uie.com/articles/three_click_rule/
      Smith, A., Magner, B., & Phelan, P. (2008, Nov. 20). Think aloud protocol part 2 [Video]. Retrieved May 3, 2012, from http://www.youtube.com/watch?v=dyQ_rtylJ3c&feature=related
      Willis, G. B. (2005). Cognitive interviewing: A tool for improving questionnaire design. Thousand Oaks, CA: Sage Publications.
  36. Additional Information
      Presentation slides: bit.ly/gardnersimon
      Contact us:
      Ken Simon, Reference & Instruction Technologies Librarian, Loyola Marymount University. Twitter: @ksimon. Email: ksimon@me.com
      Susan [Gardner] Archambault, Head of Reference & Instruction, Loyola Marymount University. Twitter: @susanLMU. Email: susan.gardner@lmu.edu