User Acceptance Testing The Hard Way
Graham Thomas
BCS SIGIST, 10th May 1996
CONTENTS
• Background
• Test Method
• Test Environment
• Test Execution
• Implementation
• Measures of Success
• Lessons Learnt
BACKGROUND
• The Project
• Project Structure
• The Environment
• Start Point
The Project
• Link 3 computer systems:
  • Sales & Marketing
  • Registration
  • Billing
• In 3 separate business areas
• With 3 different product lifecycles
• Supported by 3 competing suppliers
TEST METHOD
• Method
• Test Planning
• Test Analysis
• Test Scripting
• Data Definition
Test Planning
• Plans
  • Pre-determined end date
  • Stress & volume testing required
  • Re-usable test environment to be built
  • Users want to see bills produced
• Resources
  • 2 testers for 10 weeks
  • 1 strategist for 10 days
Test Planning (2)
• Proposed Strategy
  • Structured testing, driven by the User Requirements Spec.
  • Involve User Representatives
  • Data tidy & user procedures to be in place for test execution
  • Build a regression test environment
• Extra time required
• Additional resource required
Test Analysis
• Requirements Spec
  • A technical document
  • Not understood by users
  • Not understood by testers
• Technical Design Specs
  • Written by individual suppliers
  • Difficult to interpret without access to system design docs.
Test Analysis (2)
• Requirements Spec rewritten in plain English
• 200+ requirements extracted
• Business scenarios developed in workshops
• Business scenarios reviewed by suppliers
Test Scripting
• Legacy systems lacked design documentation
• Design documentation for the enhancements not delivered
• No one had knowledge of how all three systems would interface
• Management only interested in the number of scripts, not their content
Test Scripting (2)
• Management view that the Test Team could not 'cut the mustard'
• Suppliers' view that only they could test their systems
• Brought members of the suppliers' development teams on board
• Suppliers not paid until completion of testing
Data Definition
• User Representatives limited their involvement to a review capacity
• Pragmatic decisions taken to:
  • Generate test data from the limited set supplied by the User Reps.
  • Satisfy more than one requirement with a single script (see the coverage sketch below)
• Reported this as a risk through to the Project Board
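The talk does not show how this was administered, but the two pragmatic decisions are easy to make concrete. The sketch below (Python, with purely illustrative requirement IDs, script names and seed values that are not from the original project) expands a small seed data set into candidate cases and records which requirements each script claims to cover, so that uncovered requirements can be reported as a risk rather than discovered at the end.

```python
from itertools import product

# Illustrative seed data supplied by the User Reps. (hypothetical values).
customers = ["residential", "business"]
products = ["standard_tariff", "discount_tariff"]
events = ["new_registration", "tariff_change", "final_bill"]

# Expand the limited seed set into candidate test cases.
test_cases = [
    {"customer": c, "product": p, "event": e}
    for c, p, e in product(customers, products, events)
]

# Each script is mapped to the requirements it exercises, so a single
# script can satisfy more than one requirement.
script_coverage = {
    "UAT-001": {"REQ-012", "REQ-047"},
    "UAT-002": {"REQ-013"},
    "UAT-003": {"REQ-047", "REQ-101"},
}

all_requirements = {"REQ-012", "REQ-013", "REQ-047", "REQ-101", "REQ-150"}
covered = set().union(*script_coverage.values())
uncovered = all_requirements - covered

print(f"{len(test_cases)} candidate cases generated from the seed data")
print(f"Requirements not yet covered (report as a risk): {sorted(uncovered)}")
```

The point of the listing is the last line: whatever falls out of the coverage map is exactly what gets escalated to the Project Board.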
TEST ENVIRONMENT
• Determine requirements
• Specify environment
• Then Get Real!
• Complete copy of production data for all three systems
• Beg, borrow and steal!
• 'Virtual Environment'
TEST EXECUTION
• Problems
• Problems, problems, problems . . .
• Resource Requirements
• Progress Monitoring
Problems
• Delayed by late delivery of code
• Incident reporting system required (a minimal record sketch follows this list)
• Test harness didn't work
• Project Board intervention required to bring User Reps. back 'on side' and commit more of their time
• Changes!
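The presentation only says that an incident reporting system was needed, not what it looked like. As a minimal, hypothetical sketch (Python; the names, statuses and incident numbering are assumptions, not the project's actual system), the essentials are a record per incident and a logged lifecycle so progress can be reported:

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    RAISED = "raised"
    FIXED = "fixed"
    RETESTED = "retested"
    CLOSED = "closed"

@dataclass
class Incident:
    ref: str                  # e.g. "INC-0042" (illustrative numbering)
    system: str               # which of the three systems showed the fault
    script: str               # the test script that found it
    severity: int             # 1 = show-stopper ... 4 = cosmetic
    status: Status = Status.RAISED
    history: list = field(default_factory=list)

    def move_to(self, new_status: Status, note: str = "") -> None:
        """Record each state change so the position can be reported."""
        self.history.append((self.status, new_status, note))
        self.status = new_status

# Example lifecycle: found during billing tests, fixed, retested, closed.
inc = Incident("INC-0042", "Billing", "UAT-003", severity=2)
inc.move_to(Status.FIXED, "Supplier patch delivered")
inc.move_to(Status.RETESTED, "Re-run of UAT-003 passed")
inc.move_to(Status.CLOSED)
print(inc.status, len(inc.history), "state changes recorded")
```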
More Problems
• Additional testers, but no accommodation, hardware or software
• Systems integration found wanting
• System not stable enough to benefit from automation tools
• Short-term planning!
IMPLEMENTATION
• Roll-out plan
  • Three days
  • Round-the-clock working
  • Multi-site co-ordination
• Power outage
• Tape drive failure
• Unforeseen system interaction
MEASURES OF SUCCESS
• Objectives met
• Suppliers' view
• Users change operating practice
• Structured releases
• Everything tested first
• Full documentation produced
LESSONS LEARNT
• Plan testing at project inception
• Start testing early
• Expect the worst
• Gather metrics (see the sketch below)
• Measure, Monitor & Manage
• Be prepared to change
• Testing is not development contingency!!!
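"Gather metrics" and "Measure, Monitor & Manage" are stated but not illustrated in the slides. A minimal sketch of the kind of weekly arithmetic involved is below (Python; every figure and date is a made-up example, not data from the project): compare how much of the schedule has been used with how much of the script workload has been executed, and escalate when they diverge instead of letting testing absorb the slippage.

```python
from datetime import date

# Hypothetical weekly snapshot of the figures worth gathering.
planned_scripts = 180
executed = 95
passed = 70
incidents_open = 34
incidents_closed = 58

start, end, today = date(1996, 1, 8), date(1996, 3, 15), date(1996, 2, 12)

elapsed = (today - start).days / (end - start).days   # share of the schedule used
execution_rate = executed / planned_scripts            # share of scripts run
pass_rate = passed / executed if executed else 0.0

print(f"Schedule elapsed:      {elapsed:.0%}")
print(f"Scripts executed:      {execution_rate:.0%} (pass rate {pass_rate:.0%})")
print(f"Incidents open/closed: {incidents_open}/{incidents_closed}")

# Manage: flag slippage early rather than treating testing as contingency.
if execution_rate < elapsed:
    print("Behind plan - escalate: add resource, cut scope, or move the end date.")
```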