Usability Requirements: Compliance with ANSI/INCITS-354


  1. Usability Requirements: Compliance with ANSI/INCITS-354 (Stephen Jackson and Cheryl Kieliszewski)

  2. Abstract
  ANSI standard ANSI/INCITS-354, the Common Industry Format (CIF), adopted in 2001, provides a standard format for sharing usability-related data. As a newly ratified standard, the CIF has yet to gain industry-wide support and is still being evaluated for roll-out within IBM. However, adoption of the CIF within IBM is important for several reasons. Several large companies (both competitors and customers) support the CIF, and it may become a requirement for sales (similar to the Government Section 508 accessibility requirements). Early adopters of the CIF include Boeing, Kodak, Oracle Corporation, and State Farm Insurance. The CIF will lead to improvements in the User Centered Design (UCD) process within IBM through the standardization of reports across teams and products, and it will become a necessity for maintaining competitiveness with companies such as Oracle Corporation. The poster provides a history and the requirements of the CIF document, IBM corporate strategy regarding the CIF, a comparison to an existing process, UCD process improvements, and the benefits to IBM.

  3. What is the ANSI Standard ANSI/INCITS-354 Common Industry Format (CIF)?
  • A usability standard for direct comparison between competitive products
  • Testing is most likely performed by an independent testing organization
  • Testing is conducted after a product is released
  • The product is compared with a competitive product for usability
  • The comparison is evaluated and the differences weighed for purchasing decisions
  • Helps procurement and purchasing for large companies
  • The document audience is primarily usability experts
  • The CIF does not tell you what to do; it tells you how to report on what you did
  • The CIF has a dual nature: it highlights both product strengths and weaknesses

  4. History
  • 1996: NIST recognized the need to:
    • Encourage software suppliers and consumer organizations to work together to understand user needs and tasks
    • Develop a common usability reporting format for sharing usability data with consumer organizations
    • Conduct a pilot trial to determine how well the usability reporting format works and the value of using the format in software procurement
  • Keith Butler (Boeing) started and drove the standards work group
  • The IBM UCD Advisory Council and Microsoft provided feedback to the core CIF team during development of the standard
  • Ziff Davis Publishing wanted to make this a usability seal of approval, but that idea was rejected
  • A government tie-in to the ANSI standard is anticipated (similar to the accessibility requirements)
  • Being considered for inclusion in ISO Standard 9241 (usability standard)

  5. Support of CIF and Early Adopters
  • (Include user quotes here)
  • Boeing
  • Kodak
  • Oracle
  • State Farm Insurance
  • Microsoft
  • HP

  6. Standard Report Format
  • Title Page
  • Executive Summary
  • Introduction
    • Full Product Description
    • Test Objectives
  • Method
    • Participants
    • Context of Product Use in the Test
    • Experimental Design
    • Usability Metrics
  • Results
    • Data Analysis
    • Presentation of the Results
  • References
  • Appendices

  7. Title Page
  • Identify the report as a Common Industry Format (CIF) document
  • State the CIF version
  • State contact information (i.e., ‘Comments and questions: iusr@nist.gov’)
  • Product name and version/release tested
  • Research lead and contact information
  • Date(s) the test was conducted
  • Date the report was completed
  • Report author

  8. Executive Summary
  • Provides a high-level summary of the test
  • Intended to provide information for procurement decision-makers in customer organizations
  • Identification and description of the product
  • Summary of the method(s) used in the test
  • Results expressed as mean scores or another suitable measure of central tendency
  • Reason for and nature of the test
  • Tabular summary of performance results

  9. Introduction
  • Full Product Description
    • Formal product name and release or version
    • What parts were evaluated
    • Intended user population
    • Any groups with special needs
    • Brief description of the environment in which the product should be used
    • The work that is supported by the product
  • Test Objectives
    • Describes the objectives for the test
    • Functions and components the user directly or indirectly interacted with during the test
    • Whether or not the function or component tested was a subset of the total product; if so, provide the reason for testing a subset

  10. Methods
  • Key technical section of the report
  • Must provide enough information to allow an independent tester to replicate the procedure used in testing
  • Participants
    • Description of the user population and the test sample
    • Total number of participants tested
    • Segmentation of the user groups tested
    • Key characteristics and capabilities of the user groups
    • How participants were selected and whether they met the essential characteristics and capabilities
    • Whether or not the participant sample included representatives of groups with special needs

  11. Methods
  • Context of Product Use in the Test
    • Description of the tasks, scenarios, and conditions in which the test was performed
  • Tasks
    • Description of the task scenarios used for testing
    • Explanation of why the scenarios were selected
    • Description of the source of the tasks
    • Description of the task data provided to the participants
    • Completion or performance criteria established for each task
  • Test Facility
    • Physical description of the test facility
    • Details of relevant features or circumstances that may affect the quality of the results (e.g., recording equipment, one-way mirrors)
  • Participant’s Computing Environment
    • Software and hardware configuration details, display details, audio details, and/or manual input details
  • Test Administrator Tools
    • Describe any hardware or software used to control the test or record data
    • Describe any questionnaires used to collect data

  12. Methods
  • Experimental Design
    • Describes the logical design of the test
  • Procedure
    • Provide the independent or control variables, operational definitions of measures, and any policies or procedures for task time limits, training, assistance, intervention, or responding to questions
    • Provide the sequence of events from greeting to dismissing participants
    • Provide the steps the evaluation team followed to execute the study and record data, and the roles team members played
    • State details of non-disclosure agreements, informed consent/human subjects’ rights, and compensation
  • Usability Metrics
    • Explain what measures were used for each category of usability metrics: completion rates, errors, assists, time-on-task, and satisfaction ratings

  13. 3 Required CIF Usability Metric Categories (a computation sketch follows this slide)
  • Effectiveness: Empowering users to succeed in their tasks
  • Efficiency: Enabling people to work faster to save time and money
  • Satisfaction: Reducing frustration and under-utilization
  3 Additional IBM Usability Metric Categories
  • Flexible: Allowing people to work in ways that match their situation
  • Easy to learn: Reducing time to value with or without training
  • Safe: Preventing accidents and business errors
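The three required categories reduce to simple quantitative measures: an unassisted completion rate for effectiveness, time-on-task for efficiency, and rating-scale scores for satisfaction. The Python sketch below is a minimal illustration of how such measures might be computed from per-participant, per-task test data; the record fields and example values are hypothetical and are not prescribed by the CIF, which only governs how results are reported.

```python
# Minimal sketch (illustrative only): computing the three required CIF
# metric categories from hypothetical per-participant, per-task records.
from statistics import mean

# Each record: whether the participant completed the task unassisted,
# how long it took (seconds), and a post-task satisfaction rating (1-7).
results = [
    {"task": "T1", "completed": True,  "time_s": 210, "satisfaction": 6},
    {"task": "T1", "completed": False, "time_s": 340, "satisfaction": 3},
    {"task": "T2", "completed": True,  "time_s": 125, "satisfaction": 5},
]

# Effectiveness: unassisted task completion rate.
effectiveness = mean(1 if r["completed"] else 0 for r in results)

# Efficiency: mean time-on-task (sometimes also reported as completion
# rate per unit time).
mean_time = mean(r["time_s"] for r in results)

# Satisfaction: mean rating on the post-task scale.
satisfaction = mean(r["satisfaction"] for r in results)

print(f"Effectiveness (completion rate): {effectiveness:.0%}")
print(f"Efficiency (mean time-on-task):  {mean_time:.0f} s")
print(f"Satisfaction (mean rating, 1-7): {satisfaction:.1f}")
```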

  14. Results
  • Second major technical section of the report
  • Describes how the data were scored, reduced, and analyzed, and provides the major findings in quantitative form
  • Data Analysis
    • Provide sufficient detail to allow replication of the data scoring, data reduction, and analysis methods by another organization
  • Presentation of the Results
    • Required to report effectiveness, efficiency, and satisfaction results in tabular and graphical presentations that describe the data (a table sketch follows this slide)
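As a rough illustration of the tabular presentation, the sketch below summarizes the hypothetical records from the previous example as one row per task, with a column for each required metric category. The column layout is an assumption for illustration; the CIF requires that effectiveness, efficiency, and satisfaction be reported in tabular and graphical form but does not mandate a specific table design.

```python
# Minimal sketch (illustrative only): one summary row per task with the
# three required CIF categories, using the same hypothetical records as
# the previous sketch.
from collections import defaultdict
from statistics import mean

results = [
    {"task": "T1", "completed": True,  "time_s": 210, "satisfaction": 6},
    {"task": "T1", "completed": False, "time_s": 340, "satisfaction": 3},
    {"task": "T2", "completed": True,  "time_s": 125, "satisfaction": 5},
]

# Group the task-level records by task.
by_task = defaultdict(list)
for r in results:
    by_task[r["task"]].append(r)

# Print one row per task: completion rate, mean time-on-task, mean rating.
print(f"{'Task':<6}{'Completion':>12}{'Mean time (s)':>15}{'Satisfaction':>14}")
for task, rows in sorted(by_task.items()):
    completion = mean(1 if r["completed"] else 0 for r in rows)
    mean_time = mean(r["time_s"] for r in rows)
    satisfaction = mean(r["satisfaction"] for r in rows)
    print(f"{task:<6}{completion:>12.0%}{mean_time:>15.0f}{satisfaction:>14.1f}")
```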

  15. Appendices
  • Detailed study materials
  • Customer questionnaires, participant general instructions, participant task instructions, release notes

  16. References
  • Common Industry Format for Usability Test Reports, version 1.1, October 28, 1999. Available from iusr@nist.gov
  • P. Englefield (personal communication, June 6, 2003)
  • D. Gonzalez (personal communication, April 25, 2003)
  • E. Reinke (personal communication, June 9, 2003)
  • K. Vredenburg (personal communication, May 13, 2003)

  17. Revenue Comparison
  • (Placeholder: content to be added)

  18. Improvements in Usability Coverage
  • Usability tests designed to meet CIF requirements
  • Standardizes UCD reporting
  • Standardizes a core set of UCD metrics
  • Provides a standard means to measure the usability of competitive products

  19. Benefits and Competitiveness
  • Improvements to the UCD process will help drive user-friendly products that meet user requirements
  • The CIF will provide a yardstick for comparing our usability to the competition and will highlight areas where we can improve on or exceed the competition
  • Usability is a product differentiator
