Running a Successful Beta Test Program

Presentation Transcript


  1. Running a Successful Beta Test Program Adam Long (CTO) Darren Gage (QA Mgr) Sam Soubra (Dev Mgr) September 2010

  2. Agenda • About Us • Running a Beta Test Program • Runtime Intelligence Service • Overview • Code Demo • Questions

  3. About our Company

  4. About our Software • Qualitative Data Analysis • Unstructured Data • Documents • Audio / Video • Images • Spreadsheets • What MS Excel does for structured numerical data, we do for unstructured non-numerical data

  5. About our Software • NVivo 9 • .NET 3.5 • Win Forms • WPF • WCF • SQL Server 2008 R2 Express • NVivo Server 9 • Silverlight 3 • SQL Server 2008 R2 Standard

  6. Beta Test – What & Why • Beta testing is conducted by a sample of customers to identify flaws in feature-complete software • Generally occurs after alpha testing (unit & system) • What we do, and why: • Validate objectives – Market, Features, Quality • Improve outcomes – Collaboration / Innovation • Assist marketing – Quotes / Case Studies • Plan future direction – Opportunities

  7. 7 Steps of a Beta Test Program

  8. 1. Plan • Scope • Goals & Success Criteria • Approach • Open vs. Closed • Responsibilities • Management, Communications, Defects, Suggestions • Schedule • Balance between having enough features available to test versus enough time to incorporate feedback

  9. 2. Promote • Internal • Sales, Marketing, Training • External • Existing Customers • Potential Customers • Website & Social Media announcements • Customer Newsletter • Email Select Customers • Best Beta Testers • Interested • Independent • Vocal • Avoid pressure to include • Sales Leads • Evaluators

  10. 3. Select • Online Application • Outline Program Expectations • 20 Questions • Technical Environment • Market Segment • Research Background • Competitor Software • Motivation • Result • 523 Applicants • 78 Testers Selected • Selection Process • Top Previous Beta Testers • Top Forum Posters • Subject Matter Experts • Random Applicants • Simulation Program to Optimize Spread of Answers to Questions • Obtain Signed Agreement
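
The "simulation program to optimize spread of answers" can be approximated with a simple greedy pass that repeatedly picks the applicant who contributes the most answer combinations not yet represented. The sketch below is purely illustrative – the Applicant type, property names, and scoring rule are assumptions, not the program the team actually used:

    // Hypothetical sketch of a greedy "spread of answers" tester selection.
    using System.Collections.Generic;
    using System.Linq;

    class Applicant
    {
        public string Name { get; set; }
        public string[] Answers { get; set; }   // one answer per questionnaire question
    }

    static class TesterSelection
    {
        // Repeatedly pick the applicant whose answers add the most
        // not-yet-covered (question, answer) pairs to the selected pool.
        public static List<Applicant> Select(IEnumerable<Applicant> applicants, int count)
        {
            var remaining = applicants.ToList();
            var selected = new List<Applicant>();
            var covered = new HashSet<string>();   // keys of the form "questionIndex:answer"

            while (selected.Count < count && remaining.Count > 0)
            {
                var best = remaining.OrderByDescending(a =>
                    a.Answers.Select((answer, q) => q + ":" + answer)
                             .Count(key => !covered.Contains(key))).First();

                foreach (var key in best.Answers.Select((answer, q) => q + ":" + answer))
                    covered.Add(key);

                selected.Add(best);
                remaining.Remove(best);
            }
            return selected;
        }
    }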

  11. 4. Collect • Online Webcasts (Citrix GoToWebinar) • 5 recorded videos & 88 feedback forms • Online Forums (IP.Board) • 412 suggestion posts & 335 defect posts • Online Survey (Survey Monkey) • 58 survey responses • Application Analytics (RIS) • 2,448 hours of usage statistics

  12. 5. Analyse • Identify • Defects • Suggestions • Understand • Most liked features • Most disliked features • Most important improvements needed • Dogfooding • Used NVivo Beta to analyse feedback • Videos • Forums • Surveys • In conjunction with • Application Analytics

  13. 6. Prioritize • Prioritize Suggestions • 1 Must Address for RTM • 2 Preferably Address for RTM • 3 Consider for Service Pack • 4 Defer for Future Release • 5 Rejected • Raise Defects in Bug Tracking
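
The five-level scale above maps naturally onto an enumeration in the bug-tracking system. A minimal illustrative sketch – the type and member names are assumptions, not the team's actual tracker fields:

    // Illustrative only: the 1–5 triage scale from the slide as a C# enum.
    enum SuggestionPriority
    {
        MustAddressForRtm = 1,        // must be fixed before release to manufacturing
        PreferablyAddressForRtm = 2,  // fix if time allows before RTM
        ConsiderForServicePack = 3,   // candidate for the first service pack
        DeferForFutureRelease = 4,    // park for a later release
        Rejected = 5                  // will not be actioned
    }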

  14. 7. Recognize • Active Participants (85%) • Activated the software & provided some feedback • Average of 37 hours testing over 4 weeks • Receive free copy of software • Top 20 Participants (25%) • Also receive Amazon voucher

  15. Beta Test Tips • Allocate sufficient time & resources • Observe usage patterns using Application Analytics and encourage testers to explore specific features • Reprioritize effort based on final feedback • Hold weekly status meetings • Communicate outcomes to internal stakeholders

  16. Runtime Intelligence – What is it? • As a component of SDLC management, runtime intelligence supports better development, support, and R&D investment decisions • It leverages technologies, managed services, and practices for the collection, integration, analysis, and presentation of application usage levels, patterns, and practices

  17. Runtime Intelligence – Why use it? • Gauge the effectiveness of the beta test across the Collection and Analysis phases: • Platform usage • Feature usage • Beta tester effectiveness

  18. Runtime Intelligence – Why use it? • Improve decision making and understand quality: • Analysis Phase – usage context while analysing feedback • Prioritization Phase – usage context while prioritising feedback

  19. Runtime Intelligence – Tool Choice • PreEmptive RIS • Features matched those in competitors' offerings • We had used other PreEmptive tools (Dotfuscator) • Integrated into Visual Studio 2010 • Other tools out there: • EQATEC Analytics • Gibraltar Software

  20. Runtime Intelligence – Architecture • 1. Instrumented app is run • 2. Usage data is sent to the data repository • 3. Dashboard and reports

  21. Runtime Intelligence – Implementation

  22. Runtime Intelligence – Services (run in-house or in PreEmptive's cloud) • Runtime Intelligence endpoint • Runtime Intelligence repository • Aggregate, validate, manage, & publish • Managed or on-premise • For internal dev. and multi-tenant constituencies • Runtime Intelligence reports & dashboards • Owner, application, feature, user, & custom • RESTful API and export • IDE, CRM, ERP, & IT Operations
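
Because the repository exposes a RESTful API and export, usage data can also be pulled programmatically into other systems (IDE, CRM, ERP, IT operations). The endpoint URL, query string, and authorization header below are hypothetical placeholders, not the actual RIS API; the sketch only shows the general pattern:

    using System;
    using System.IO;
    using System.Net;

    // Hypothetical pull of usage data from a RESTful reporting endpoint.
    // The URL, query string, and auth header are placeholders, not the real RIS API.
    class UsagePull
    {
        static void Main()
        {
            var url = "https://example.invalid/ris/api/feature-usage?app=NVivo9Beta"; // placeholder
            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Headers["Authorization"] = "Bearer YOUR_API_TOKEN";               // placeholder

            using (var response = (HttpWebResponse)request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                Console.WriteLine(reader.ReadToEnd()); // hand the payload to your own tooling
            }
        }
    }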

  23. Runtime Intelligence – Reports and Dashboard • PreEmptive's RIS Portal

  24. Runtime Intelligence – Export • Data is rich • You will need to spend time to mine it!
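
As a starting point for that mining, a hedged sketch that tallies feature usage from a CSV export. The file name and column layout are assumptions; adjust them to match what the portal actually exports:

    using System;
    using System.IO;
    using System.Linq;

    // Assumed input: a CSV export saved as ris_export.csv whose first column is the
    // feature name (header row included). File name and layout are assumptions.
    class ExportMiner
    {
        static void Main()
        {
            var featureCounts = File.ReadAllLines("ris_export.csv")
                .Skip(1)                                   // skip the header row
                .Select(line => line.Split(','))
                .GroupBy(cols => cols[0])                  // group by feature name
                .Select(g => new { Feature = g.Key, Uses = g.Count() })
                .OrderByDescending(x => x.Uses);

            foreach (var row in featureCounts)
                Console.WriteLine("{0}: {1} uses", row.Feature, row.Uses);
        }
    }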

  25. Runtime Intelligence – Export

  26. Runtime Intelligence – Export

  27. Runtime Intelligence – Where to next? • Move into Production • Improve Development Decision Making for future releases

  28. Code Demo – Dotfuscator by PreEmptive Solutions • Runtime Intelligence defines several message types: • Application and Session Start • Application and Session Stop • Feature • Performance Probe • System Profile • Tamper Detected

  29. These messages drive the Runtime Intelligence Portal's dashboards. To have your application send these messages, you must: • Be a Runtime Intelligence Service subscriber • Activate Runtime Intelligence from within Dotfuscator • Attribute your application with Runtime Intelligence attributes, including Setup and Teardown • Run your application through Dotfuscator with the Send Analytics Messages option turned on

  30. Business Attribute

  31. Application Attribute

  32. Setup Attribute

  33. Teardown Attribute

  34. Feature Attribute

  35. Sample Application

  36. Sample Code
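
A minimal sketch of how the Business, Application, Setup, Teardown, and Feature attributes from the preceding slides fit together in a sample application. The attribute names match the slides, but the namespace, constructor and property signatures, and GUID values are assumptions to verify against the Dotfuscator / Runtime Intelligence documentation. The attributes are declarative markers only: Dotfuscator injects the actual analytics calls when the assembly is processed with the Send Analytics Messages option enabled.

    using System;
    using PreEmptive.Attributes;   // assumed namespace for the Runtime Intelligence attributes

    // Assembly-level attributes: which company account and application the usage data belongs to.
    // GUID values are placeholders; the property names are assumptions.
    [assembly: Business(CompanyKey = "00000000-0000-0000-0000-000000000000")]
    [assembly: Application(Guid = "11111111-1111-1111-1111-111111111111")]

    public class SampleApp
    {
        [Setup]                      // injection point for Application/Session Start messages
        public static void Main()
        {
            ExportResults();
            Shutdown();
        }

        [Feature("Export Results")]  // emits a Feature message when the method runs
        private static void ExportResults()
        {
            Console.WriteLine("Exporting results...");
        }

        [Teardown]                   // injection point for Application/Session Stop messages
        private static void Shutdown()
        {
            Console.WriteLine("Shutting down.");
        }
    }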

  37. ILDASM Before

  38. ILDASM After

  39. Analytical Data

  40. Questions?
