Assignments • HMI • Rosella Gennari • 2012-2013
Usability Testing (1/3) • Assignment: in groups of ca. 2-3 students, evaluate an app such as the Trenitalia one (or Bahn, or...) on a smartphone or tablet • (Step 1) Personal study of the evaluation slides by December 17th • (Step 2) Planning: during the lab of December 21st, using the planning section of the structured report provided by the teacher: • collaboratively discuss and negotiate the plan, with clear, distributed roles for the group members; ideally, • one team member observes and interacts with the user (when the user cannot complete the task alone within the expected maximum time) • and the other records the users’ actions/tasks, their timings, their order and their comments in notes • write down the plan as structured in the report doc • submit it to the teacher by December 22nd
Usability Testing (2/3) • (Step 3) Run the evaluations before January 7th: • at the start of each evaluation • gather essential data about your users while respecting their privacy, e.g., age range, expertise with the system, expertise with the app (it is better to have users with no prior expertise with the app), gender • explain the think-aloud and observation methods to your users: • present them the goal and the starting point of the app • briefly explain that you will be evaluating the app (not the user!), that you will assign them one task at a time, and how the think-aloud method works (see the usability evaluation slides) • assign the users one task at a time • during the evaluation, pay attention to • the tasks the users utter • the timings of the performed tasks • the comments uttered per task • and already try structuring your notes accordingly (a note-logging sketch follows below)
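The assignment does not prescribe a tool for the notes. As a minimal sketch of how the timings, task order and comments could be kept structured (the file name notes.csv and all field names are assumptions of this example, not part of the assignment), each observation could be appended to a CSV file:

```python
import csv
from datetime import datetime

# Hypothetical observation log: one row per (user, task) pair.
# Field names are illustrative, not prescribed by the assignment.
FIELDS = ["user_id", "task_id", "start", "end", "completed", "comments"]

def log_observation(path, user_id, task_id, start, end, completed, comments):
    """Append one structured observation to a CSV notes file."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:              # write the header only once
            writer.writeheader()
        writer.writerow({
            "user_id": user_id,        # anonymised code, e.g. "U1"
            "task_id": task_id,        # e.g. "T1"
            "start": start.isoformat(),
            "end": end.isoformat(),
            "completed": completed,    # True / False
            "comments": comments,      # think-aloud remarks, observer notes
        })

# Example: user U1 completed task T1 in 2.5 minutes.
log_observation("notes.csv", "U1", "T1",
                datetime(2013, 1, 5, 10, 0, 0),
                datetime(2013, 1, 5, 10, 2, 30),
                True, "hesitated on the date picker")
```

Keeping one structured file per session makes the debriefing (Step 4) and the later result analysis easier to share within the team.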
Usability Testing (3/3) • (Step 4) Right after each evaluation, run a debriefing: the evaluators discuss the notes and cross-check observations at critical points • (Step 5) Start collaboratively documenting your evaluation in the report doc during the lab of January 11th: • appoint an editor in charge of revising the document before submission • distribute the work, e.g., each member takes care of one session • in the report, clearly specify who wrote what and who the editor is • finish co-writing all sessions by January 13th • (Step 6) The editor submits the document to the teacher in pdf format by January 15th, 3 pm • (Step 7) By January 17th, starting from the report, • collaboratively write ca. 10-15 slides according to the following schema • decide who presents what • (Step 8) Present the slides during the lab of January 18th
PLANNING: GOAL • Specify the goal of the evaluation
PLANNING: EVALUATORS AND TYPE(S) OF USERS • Describe here the types of users (no names!) with their essential data • gender, age range, expertise with the system and with the app • you can also reuse the relevant parts of the context-of-use report (see homework) for describing them • give the number of users per type (see the sketch below)
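Purely as an illustration of "no names, essential data only" (the record fields below are assumptions of this example, not required by the report template), the anonymised user data could be kept as small records so that the number of users per type is easy to count:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Participant:
    """Anonymised participant record: essential data only, no names."""
    user_id: str           # e.g. "U1"
    gender: str
    age_range: str         # e.g. "20-29"
    system_expertise: str  # expertise with the smartphone/tablet
    app_expertise: str     # expertise with the app, ideally "none"

# Illustrative entries only
participants = [
    Participant("U1", "F", "20-29", "daily smartphone user", "none"),
    Participant("U2", "M", "30-39", "occasional smartphone user", "none"),
]

# Number of users per type, here grouped by system expertise
print(Counter(p.system_expertise for p in participants))
```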
PLANNING: MATERIAL • Describe the app, the system and any other material used in the evaluation, e.g., a Sony camera for recording
EXPERIMENT EXECUTION (DIARY) • Describe the execution of the experiment as structured in the report doc • Give screenshots or photos of the data gathering activities, e.g., of the mobile phone and of the users (no faces)
RESULT ANALYSIS AND VISUALISATION • Report the results for task success and timing (see the performance metrics slides and the report doc) • Report the unique negative usability issues, per task, and plot the data if feasible (see the performance metrics slides and the report doc); a plotting sketch follows below
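A minimal analysis sketch, assuming the notes were logged in the hypothetical notes.csv of the earlier example: it computes the task success rate and the mean completion time per task with pandas and plots both as bar charts with matplotlib (the library choices are mine, not mandated by the course).

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assumed columns: user_id, task_id, start, end, completed, comments
notes = pd.read_csv("notes.csv", parse_dates=["start", "end"])
notes["seconds"] = (notes["end"] - notes["start"]).dt.total_seconds()

# Performance metrics per task
success_rate = notes.groupby("task_id")["completed"].mean() * 100  # % of users
mean_time = notes.groupby("task_id")["seconds"].mean()

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
success_rate.plot.bar(ax=ax1, title="Task success (%)")
mean_time.plot.bar(ax=ax2, title="Mean completion time (s)")
plt.tight_layout()
plt.savefig("performance_metrics.png")
```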
DISCUSSION • Discuss your results concerning performance metrics, e.g., • do they allow you to refine your types of users into classes of users? • Discuss your results concerning usability issues, e.g., • which design choices do they suggest?
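One possible way to check whether the performance data separates your types of users into classes (again assuming the hypothetical notes.csv above, plus a participants.csv exported from the planning records) is to join the two tables and compare the metrics per user type:

```python
import pandas as pd

# Hypothetical files from the earlier sketches
notes = pd.read_csv("notes.csv", parse_dates=["start", "end"])
users = pd.read_csv("participants.csv")  # user_id, age_range, app_expertise, ...

notes["seconds"] = (notes["end"] - notes["start"]).dt.total_seconds()
merged = notes.merge(users, on="user_id")

# Do users with different app expertise succeed and perform differently?
print(merged.groupby("app_expertise")[["completed", "seconds"]].mean())
```

If the per-type means clearly differ, that is evidence for refining the types into classes of users; if they do not, say so in the discussion.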
WHAT YOU LEARNT • Be critical towards your own experience, that is, discuss the following within the team: • The data gathering activities: how did the users feel? What went wrong? What could you improve? • The data analysis activity: was it easy or difficult, and for what? Was it time-consuming? ...