Jim Dolbear, Director, Voting Systems Institute

Presentation Transcript


  1. Session 8 – Best State Practices: “Best Practices, Quality Systems, Customer Requirements and Testing as Applied to Voting” Jim Dolbear, Director, Voting Systems Institute. dolbear@votinginstitute.org 310.899.3800 ext. 163. 9343 Culver Blvd., Culver City, CA 90232

  2. Examples of Best Practices for Voting • Quality systems and product design from industry. • The ITAs (independent testing authorities) bring some of this background to voting. • Auditing procedures and a reporting culture from FASB and corporate accounting. • Technical development models from academia and science. • NIST brings this experience to voting. • Inclusive, non-governmental standards setting from the internet and the IETF (the “Internet Engineering Task Force”). • Performance testing and review from the automotive industry.

  3. Defining Customer Needs or Requirements/Defining the Performance Objectives for a System • House of Quality concept. From the U.S. to Japan and back again. • Deming, Crosby and Nash. • Toyota Production System. Taiichi Ohno and Shigeo Shingo. • Now they talk about “Six Sigma” projects. • Focus on the customer. • From customer needs to technical requirements. • An example from the car industry. • Defining requirements and then testing for voting systems is similar.
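
As a rough illustration of the House of Quality idea on this slide, the sketch below maps weighted customer needs to technical requirements through a relationship matrix and ranks the requirements by priority. The needs, requirements, weights, and relationship strengths are invented for illustration, not figures from the talk.

```python
# Minimal sketch of a House of Quality (QFD) style mapping from customer
# needs to technical requirements. All names and weights are illustrative.

# Customer needs and their relative importance (1-5).
needs = {
    "easy to mark a ballot": 5,
    "fast check-in": 3,
    "accessible to blind voters": 5,
}

# Relationship strength (0, 1, 3, 9) between each need and each
# technical requirement, as in a typical QFD matrix.
relationships = {
    "easy to mark a ballot":      {"large print ballot": 9, "audio interface": 1, "scanner throughput": 0},
    "fast check-in":              {"large print ballot": 0, "audio interface": 0, "scanner throughput": 9},
    "accessible to blind voters": {"large print ballot": 1, "audio interface": 9, "scanner throughput": 0},
}

# Priority of a technical requirement = sum of (need weight x relationship strength).
priorities = {}
for need, weight in needs.items():
    for req, strength in relationships[need].items():
        priorities[req] = priorities.get(req, 0) + weight * strength

for req, score in sorted(priorities.items(), key=lambda kv: -kv[1]):
    print(f"{req}: {score}")
```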

  4. Defining and Achieving Customer Requirements

  5. Requirements, Specifications, Testing • What are we trying to test in voting? Talking about testing in a vacuum is putting the cart before the horse. • Need to know how we would want a voting system to perform before we can test it to see if it passes muster. • First, requirements tell us how we need a product, process or system to perform functionally. • Sometimes requirements must reflect practical compromises and don’t include everything we desire because of technical and resource constraints. • Minivans only go so fast. A requirement might be 0-60 acceleration in 8.5 seconds. A desire would be 0-60 in 6.5. • Specifications are typically technical definitions of materials, procedures, and structures that must be used to achieve requirements. • Tests (and inspection) are done to evaluate whether the requirements are in fact being met by the system.
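
To make the requirement-versus-desire-versus-test distinction concrete, here is a minimal sketch using the 0-60 figures from the slide; the measured times passed to the test are made up.

```python
# A requirement is a measurable performance target; a test evaluates
# whether a measured result meets it. The 8.5 s and 6.5 s figures come
# from the slide; the measured values below are made-up examples.

REQUIRED_0_TO_60_S = 8.5   # requirement: 0-60 mph in at most 8.5 seconds
DESIRED_0_TO_60_S = 6.5    # desire, traded off against cost and seating capacity

def meets_acceleration_requirement(measured_seconds: float) -> bool:
    """Pass if the measured 0-60 time satisfies the requirement."""
    return measured_seconds <= REQUIRED_0_TO_60_S

print(meets_acceleration_requirement(8.1))  # True: requirement met, even though the desire is not
print(meets_acceleration_requirement(9.2))  # False: requirement not met
```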

  6. Testing and Product Design and Customer Requirements – a Virtuous Circle of Improvement • The National Highway Traffic Safety Administration develops crash tests and begins to test vehicles and report the results. • Vendors focus on achieving better results. • Customers realize that safer cars can be made and begin to value safety in their purchase decisions. • Safety improvements over a 30-year period result in much safer cars, higher profits for manufacturers of safe cars, and pleased, safe drivers as customers.

  7. Voting Product Design: User Interface Requirement and Testing • Define an objective test to measure good and bad performance. • Say 100 people interpreting 10 different ballot styles. • Measure their comprehension and error rate. • Apply the test to different systems -- paper-based and touchscreen. • Provide the basis for scientifically evaluating the performance of paper-based systems. • Report the results to support a choice based on cost and performance.
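
A sketch of how the usability test described above could be scored: each voter marks several ballot styles on each system and an observed error rate is reported per system. The trial counts and error counts here are placeholders, not real study data.

```python
# Scoring sketch for the ballot-interpretation test described above.
# Each record is (system, ballot_style, errors, attempts); all numbers
# are invented placeholders.
from collections import defaultdict

trials = [
    ("paper",       "style-01", 3, 100),
    ("paper",       "style-02", 5, 100),
    ("touchscreen", "style-01", 2, 100),
    ("touchscreen", "style-02", 6, 100),
]

errors = defaultdict(int)
attempts = defaultdict(int)
for system, style, err, att in trials:
    errors[system] += err
    attempts[system] += att

# Report an observed error rate per system to support a cost/performance choice.
for system in errors:
    rate = errors[system] / attempts[system]
    print(f"{system}: {rate:.1%} error rate over {attempts[system]} ballot interpretations")
```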

  8. Product Design: User Interface Requirement and Testing

  9. Absolute Requirements and Requirements Which Must Be Traded Off • Designing a car: • Example/Absolute: Seatbelts are required by law. • Example/Tradeoff: Zippy acceleration vs. seating for seven. • Designing a voting system: • Absolute voting requirement: Providing access for the blind. • Absolute voting requirement: State ballot style laws. • Tradeoff: Exposure to the vulnerability of computers vs. ease of automated tabulation and ballot creation. • Tradeoff: Auditing protocols vs. operational complexity. • Tradeoff: Sometimes privacy vs. security.
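
One way to picture this split is to treat absolute requirements as hard gates a design must pass before any tradeoff is weighed. The checks and the candidate system below are illustrative assumptions, not requirements drawn from any statute.

```python
# Absolute requirements act as pass/fail gates; tradeoff dimensions are
# scored separately and weighed by stakeholders. All checks and values
# here are illustrative assumptions.

absolute_requirements = {
    "audio ballot for blind voters":   lambda d: d["has_audio_ballot"],
    "supports required ballot styles": lambda d: d["supported_ballot_styles"] >= d["required_ballot_styles"],
}

candidate = {
    "has_audio_ballot": True,
    "supported_ballot_styles": 12,
    "required_ballot_styles": 10,
    # Tradeoff dimensions (higher is better), to be weighed openly by stakeholders.
    "ease_of_auditing": 3,
    "tabulation_speed": 4,
}

failures = [name for name, check in absolute_requirements.items() if not check(candidate)]
if failures:
    print("Rejected; unmet absolute requirements:", failures)
else:
    print("Absolute requirements met; now weigh tradeoffs:",
          {k: candidate[k] for k in ("ease_of_auditing", "tabulation_speed")})
```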

  10. Importance of Formally Defining Requirements • A “Quality System” provides for the formal “archival” definition and documentation of customer requirements and corresponding technical product and process requirements/specifications. • It’s particularly important to document the dependency relationships between different, conflicting requirements. • Let’s get rid of this heavy bumper and crumple zone to make the car go faster. • Let’s use computers for voting so different ballot formats and tabulation will be much easier. • State code typically defines some or many of the customer requirements for voting. The testing protocols should provide for technical evaluation of achievement of these requirements.
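
One hypothetical shape for the “archival” documentation described here is a requirements record that names which requirements conflict, so a change to one forces a review of the other. The fields and entries below are invented, not a real quality-system schema.

```python
# Sketch of a requirements record that documents dependency and conflict
# relationships between requirements. Fields and entries are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    statement: str
    source: str                       # e.g. state code section or customer need
    conflicts_with: list = field(default_factory=list)

requirements = [
    Requirement("R1", "Results tabulated within 4 hours of poll close", "county need"),
    Requirement("R2", "Minimize exposure of tabulation software to tampering", "security review",
                conflicts_with=["R1"]),
]

# Any change proposal touching one requirement should trigger review of
# the requirements it conflicts with.
for r in requirements:
    if r.conflicts_with:
        print(f"{r.req_id} trades off against {', '.join(r.conflicts_with)}: {r.statement}")
```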

  11. The Future: Encouraging New Technological Developments • Meet absolute requirements more effectively at lower cost. • Reduce or eliminate the need for tradeoffs. • Lay out a plan for participating with development partners and encouraging innovation. • Example/Aluminum Crumple Zones: Lighter and Stronger! • Encrypted Receipts: Improved privacy and security? • Open Source Voting Systems: Cheaper and More Reliable?

  12. Technical Design Improvements

  13. Performance Specifications or Requirements vs. Design Specifications • Voting systems should perform at a certain level for each critical customer requirement. Testing this performance objectively is the aim. • States should not specify or implicitly require particular designs. • A 5-star crash test does not test the gauge of the steel in the frame. • If a crumple zone gets 5 stars with lighter steel, it’s still 5 stars. • Voting System Performance Rating (www.vspr.org) • Bring together all stakeholders -- election officials and experts, vendors, computer and security professionals, social scientists and advocates. • Define performance attributes for voting systems. • Develop and define objective tests to evaluate performance of systems. • How secure • How private • How transparent
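
In the spirit of the VSPR idea on this slide, the sketch below rates a system only on the outcome of objective tests for each attribute, never on its internal design. The attributes, tests, and numbers are invented placeholders.

```python
# Performance (not design) rating sketch: each attribute is scored by an
# objective test on observed results. Attributes, tests, and numbers are
# invented placeholders.

def rate_security(results):
    # e.g. share of penetration-test scenarios withstood, mapped to a 0-5 scale
    return round(results["pen_tests_passed"] / results["pen_tests_run"] * 5)

def rate_privacy(results):
    # e.g. share of cast ballots that could not be linked back to a voter
    return round(results["unlinkable_ballots"] / results["ballots_examined"] * 5)

attribute_tests = {"security": rate_security, "privacy": rate_privacy}

observed = {
    "pen_tests_passed": 17, "pen_tests_run": 20,
    "unlinkable_ballots": 97, "ballots_examined": 100,
}

report = {attr: test(observed) for attr, test in attribute_tests.items()}
print(report)  # e.g. {'security': 4, 'privacy': 5}
```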

  14. State Best Practices - Process • Publish and meet all absolute requirements for voting systems as laid out in Federal and State law. • Involve all stakeholders and make responsible, informed tradeoffs where a choice must be made between performance objectives. • Over-communicate the rationale for choosing particular solutions particularly where there are tradeoffs. • Why punchcards? Low cost. Secure. Auditable. • Why computerized voting? Access. Ballot creation. Tabulation. • Actively participate in and sometimes lead the development of higher performing/cheaper solutions. • Early voting, provisional voting and absentee voting as test beds. • Work with counties to develop, nurture and learn from different approaches.
