
TETRA. Technical Debt Reduction Platform

The quality of a software product can be seen as the sum of the various technical debts embedded in the product and delivered to end users, which we can define and measure with the aim of eliminating or at least minimizing them. Technical debt is broadly defined as a product's combined non-compliance with technical guidelines and business objectives that negatively impacts business results. Eight categories of technical debt (and product quality) are classified, and the crucial metrics and their acceptable values are identified for each category. As the result of this comprehensive research effort of more than 6,000 hours, a Technical Debt Reduction Platform called TETRA was built. It has become a proprietary practice for measuring and controlling technical debt and assessing software products.



  1. TETRA: Technical Debt Reduction Platform. See your product from the inside.

  2. What is Technical Debt? Technical debt, according to industry best practices, is any code added now that will take more work to fix at a later time, typically accepted for the purpose of achieving rapid gains. We at Intetics consider technical debt to be a product's combined non-compliance with technical guidelines and business objectives that negatively impacts business continuity.

  3. The TETRA Story The tetra is a tiny fish you can see in almost every aquarium, named for its four (tetra = four in Greek) distinctive fins. It's a nice, small fish with thousands of species available. It resembles the technical debt of your software product, which consists of small, insignificant, sometimes "locally beautiful" trade-offs: quick solutions that are easy to implement instead of the overall best solution. Exactly as with financial debt, when technical debt piles up you pay "interest" in the form of harder maintainability, a mediocre user experience, declining development-team productivity, and overall higher costs. And the more tetras in your aquarium, the more food they need and the more often you have to clean it. Intetics put more than 6,000 hours of original research into TETRA's development.

  4. Multiple Faces of Technical Debt In boring techie slang, technical debt is… quality. Quality: • Defines the level of customer satisfaction • Measures excellence, or a state of being free from defects, deficiencies, and significant variations • Means meeting the requirements of the customer. Intetics sees quality as the delivered level of a software product and its processes, measured by business-critical metrics.

  5. TETRA – TECHNICAL DEBT REDUCTION PLATFORM

  6. Who benefits from TETRA? Who needs to TETRA a software product? Investors: • To evaluate the technical state of a product for purchase • To better define the fair market value of a product • To better assess the investment risks of the intended transaction. Software owners: • To measure the amount of technical debt (aka "quality level") of a product • To get objective insights for decision making and priority setting • To improve software processes and increase the proficiency of the development team.

  7. TETRA Elements TETRA assesses a product along eight (two tetras!) dimensions: • Source code quality – assessment of the quality of the product's source code by a number of critical parameters • Usability, UI & Documentation – evaluation of the product's user interface, usability, and documentation • Security – valuation of product vulnerabilities according to CVE, CERT, CWE, OSVDB, OWASP, and BID • Performance – measurement of the product's performance and load parameters • Business logic – audit of how well the product fits its business objectives • Architecture quality – expert assessment of the product's software core structure and data model • Data quality – check of the product's resistance to bad data, exception handling, and bad-data prevention • Open source code use – detection of the use of open source and other 3rd-party code

  8. SOURCE CODE QUALITY ASSESSMENT

  9. Source Code Analysis Key source code quality metrics* • Cyclomatic complexity (methods, classes, files) – a measure of the number of linearly independent paths through a program module. • Duplications (lines, blocks, files) – the number of physical lines touched by duplication. • Code coverage (lines, branches) – the percentage of source code covered by unit tests. Unit tests exercise individual units of the source code to evaluate whether they are fit for use. • Rules compliance – compliance of the source code with a coding standard and the best practices of a particular programming language. • SQALE – Software Quality Assessment based on Lifecycle Expectations, a method to support the evaluation of a software application's source code. * The list of metrics was established on the basis of best-practice analysis, Intetics original research, and vast expertise acquired during multiple projects. All of the mentioned metrics affect the probability of defect appearance; an example of this dependence is displayed in the slide's graph.
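To make the cyclomatic complexity and code coverage metrics concrete, here is a minimal sketch; the class and tests are invented for illustration and are not taken from TETRA. Cyclomatic complexity is the number of decision points in a module plus one, and coverage tools such as Jacoco record which lines and branches tests like these actually execute.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.Test;

// Hypothetical example class: CC = number of decision points + 1.
class ShippingCalculator {

    // Decision points: if (1), || (2), for (3), inner if (4)  ->  CC = 5
    static double totalWeight(double[] weights, double maxItem) {
        double total = 0;
        if (weights == null || weights.length == 0) {
            return 0;                        // guard-clause branch
        }
        for (double w : weights) {
            if (w <= maxItem) {              // items over the limit are skipped
                total += w;
            }
        }
        return total;
    }
}

// Unit tests like these are what the coverage metric counts: Jacoco reports
// the percentage of lines and branches the suite actually executes.
class ShippingCalculatorTest {

    @Test
    void skipsItemsOverTheLimit() {
        // exercises the loop and both branches of the inner if
        assertEquals(3.0, ShippingCalculator.totalWeight(new double[] {1.0, 2.0, 9.0}, 5.0));
    }

    @Test
    void emptyInputReturnsZero() {
        // exercises the guard clause, raising branch coverage
        assertEquals(0.0, ShippingCalculator.totalWeight(new double[] {}, 5.0));
    }
}
```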

  10. Intetics Source Code Quality Management Platform Toolchain: version control systems (SVN, Git) feed a Hudson build server, which runs Sonar Runner with analysis plugins (Checkstyle, PMD/CPD, JUnit, Squid, Jacoco, FindBugs) against source code in Java, Cobol, VB, PL/SQL, Flex, PHP, and C; results are stored in the Sonar DB and served through the Sonar web interface. Effect on source code quality: • Automated analysis and code review • Enforcement of coding standards • Centralized quality-metrics repository • Dramatically improved source code quality. Effect on team proficiency: • Best-practice sharing • Automated RCAs • Individual corrective actions • Remarkable improvement of team proficiency.
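As an illustration of what the automated-analysis step flags, the snippet below is an invented example (not a TETRA report); the findings noted in the comments are typical of the PMD/FindBugs/Checkstyle rule families named on the slide.

```java
public class ReportJob {

    public boolean isWeekly(String schedule) {
        // FindBugs/PMD-style finding: comparing strings with == compares
        // references, not contents; use "WEEKLY".equals(schedule) instead.
        return schedule == "WEEKLY";
    }

    public void run() {
        int retries = 3;                 // PMD-style finding: unused local variable
        try {
            isWeekly(null);
        } catch (Exception e) {
            // PMD/Checkstyle-style finding: empty catch block swallows the error
        }
    }
}
```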

  11. Code Quality Assessment Report General source code quality levels: • Perfect (A) – no problems. • Good (B) – there are low-priority issues that are not very important but should be fixed. • Medium (C) – mediocre source code quality; requires refactoring. • Low (D) – low-quality code; needs refactoring and fixing of all found issues. • Critical (E) – it is imperative to fix all issues now; high-impact problems that often prevent a user from correctly completing a task. May require redevelopment.

  12. UI, USABILITY AND DOCUMENTATION ASSESSMENT

  13. Usability Assessment The assessment is based on four components: • Valuation – expert scoring of learnability, memorability, efficiency, error tolerance, and likeability • Testing – about 130 different tests applied, depending on product complexity • List of defects – the quantity and severity of bugs provide additional insights • Documentation evaluation – expert assessment based on the developed documentation-analysis system. * The list of metrics was created on the basis of best-practice analysis, Intetics original research, and expertise acquired during multiple projects.

  14. Result Summary Usability is the ease of use of a software product. After the respective testing, the Usability Report provides objective feedback on how effectively the product meets users' needs.

  15. SECURITY ASSESSMENT

  16. Software Product Security Assessment Security – valuation of product vulnerabilities according to: • CVE – the Common Vulnerabilities and Exposures list • CERT – the Computer Emergency Response Team at the Software Engineering Institute • CWE – the Common Weakness Enumeration • OSVDB – the Open Source Vulnerability Database • OWASP – the Open Web Application Security Project • BID – Bugtraq ID, a list of security vulnerabilities. The severity level of every issue is based on CVSS, the Common Vulnerability Scoring System. * The list of metrics was gathered on the basis of best-practice analysis, Intetics original research, and expertise acquired during multiple projects.
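For context on how a CVSS severity level is derived, below is a minimal sketch of the published CVSS v3.1 base-score formula for scope-unchanged vulnerabilities. It illustrates the public standard, not Intetics code, and the choice of CVSS version is an assumption; the deck does not specify one.

```java
// CVSS v3.1 base score, Scope: Unchanged. Weights are the published
// constants, e.g. AV:N=0.85, AC:L=0.77, PR:N=0.85, UI:N=0.85, C/I/A High=0.56.
public class CvssV31 {

    // Rounds up to one decimal place, as the v3.1 spec requires. (The spec
    // defines a more careful implementation to avoid floating-point drift.)
    static double roundUp(double value) {
        return Math.ceil(value * 10.0) / 10.0;
    }

    static double baseScore(double av, double ac, double pr, double ui,
                            double c, double i, double a) {
        double iss = 1 - (1 - c) * (1 - i) * (1 - a);      // Impact Sub-Score
        double impact = 6.42 * iss;                        // Scope: Unchanged
        double exploitability = 8.22 * av * ac * pr * ui;
        if (impact <= 0) {
            return 0.0;
        }
        return roundUp(Math.min(impact + exploitability, 10.0));
    }

    public static void main(String[] args) {
        // AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:N/A:N  ->  prints 7.5
        System.out.println(baseScore(0.85, 0.77, 0.85, 0.85, 0.56, 0.0, 0.0));
    }
}
```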

  17. PERFORMANCE ASSESSMENT

  18. Performance Assessment Performance – product evaluation against customer requirements, done by load testing and simulation of virtual users' activity.

  19. Load Test Metrics The following metrics are measured (mostly for web applications); the metrics themselves are shown in a table on the slide. * The list of metrics was created on the basis of best-practice analysis, Intetics original research, and expertise acquired during multiple projects.
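As a sketch of how such load metrics are produced, the following minimal Java harness simulates virtual users against one URL and reports throughput, error rate, and latency percentiles. It is illustrative only: the target URL and parameters are hypothetical, and this is not the tooling Intetics uses.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class MiniLoadTest {
    public static void main(String[] args) throws Exception {
        String url = "https://example.com/";      // hypothetical target
        int users = 20, requestsPerUser = 10;

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).build();

        // Each virtual user runs on its own thread and records its latencies.
        ExecutorService pool = Executors.newFixedThreadPool(users);
        List<Future<List<Long>>> futures = new ArrayList<>();
        long start = System.nanoTime();
        for (int u = 0; u < users; u++) {
            futures.add(pool.submit(() -> {
                List<Long> latencies = new ArrayList<>();
                for (int r = 0; r < requestsPerUser; r++) {
                    long t0 = System.nanoTime();
                    try {
                        HttpResponse<Void> resp =
                            client.send(request, HttpResponse.BodyHandlers.discarding());
                        if (resp.statusCode() < 400) {
                            latencies.add((System.nanoTime() - t0) / 1_000_000); // ms
                        }
                    } catch (Exception e) {
                        // failed request: counted in the error rate below
                    }
                }
                return latencies;
            }));
        }

        List<Long> all = new ArrayList<>();
        for (Future<List<Long>> f : futures) all.addAll(f.get());
        pool.shutdown();

        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        int total = users * requestsPerUser;
        Collections.sort(all);
        System.out.printf("throughput: %.1f req/s%n", total * 1000.0 / elapsedMs);
        System.out.printf("error rate: %.1f%%%n", 100.0 * (total - all.size()) / total);
        if (!all.isEmpty()) {
            System.out.printf("median: %d ms, p95: %d ms%n",
                all.get(all.size() / 2), all.get((int) (all.size() * 0.95)));
        }
    }
}
```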

  20. BUSINESS LOGIC ASSESSMENT

  21. Business Logic Assessment Business logic assessment is an evaluation of how well a software product automates the intended business processes, based on feedback from different focus groups such as end users, management, customers, the community, etc. The evaluation is based on six main metrics: • Effectiveness • Product quality • Data safety • Simplicity • Business rules and policy • Competitiveness. * The list of metrics was created on the basis of best-practice analysis, Intetics original research, and expertise acquired during multiple projects.

  22. Result Summary

  23. ARCHITECTURE AND DATA MODEL ASSESSMENT

  24. Architecture Quality Architecture quality is an expert assessment of the product's software structure and data model. Good architecture provides: • Good maintainability and extensibility • Fast implementation of change requests • "Poka-yoke" (inadvertent error prevention) • Easy knowledge transfer • Adequate performance
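As a small illustration of the "poka-yoke" item, the sketch below (an invented example, not from the deck) makes invalid states unrepresentable, so callers cannot pass bad values by accident:

```java
// Poka-yoke in code: validation happens once, at the only construction point,
// so every Percentage in the system is guaranteed to be in range.
public final class Percentage {
    private final double value;

    private Percentage(double value) {
        this.value = value;
    }

    // The only way to build one; rejects out-of-range input at the boundary.
    public static Percentage of(double value) {
        if (value < 0.0 || value > 100.0) {
            throw new IllegalArgumentException("percentage out of range: " + value);
        }
        return new Percentage(value);
    }

    public double asFraction() {
        return value / 100.0;
    }
}

// Any code that receives a Percentage never needs to re-validate it:
//   double coverage = Percentage.of(87.5).asFraction();
```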

  25. Structure of Assessment • Security: authentication, authorization • Design patterns • Services: transactions, caching, logging, infrastructure, validation, workflow • Exception handling • Components: infrastructure, recycling • Layers: data, business, presentation • SOLID, service interfaces, coupling and cohesion. * The list of metrics was created on the basis of best-practice analysis, Intetics original research, and expertise acquired during multiple projects.

  26. Result Summary

  27. QUALITY OF DATA ASSESSMENT

  28. Quality of Data Quality of data is an assessment of the product's resistance to bad data, its exception handling, and its bad-data prevention. The assessment is based on seven main metrics: • Accuracy • Consistency • Validity • Currency • Completeness • Timeliness • Accessibility. * The list of metrics was created on the basis of best-practice analysis, Intetics original research, and expertise acquired during multiple projects.
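A minimal sketch of how two of these metrics can be scored (an invented example; the record type and checks are hypothetical): each metric is the percentage of records passing a corresponding check, which is one way to read the percentages on the next slide.

```java
import java.util.List;
import java.util.function.Predicate;

public class DataQualityCheck {

    record Customer(String email, String country) {}

    // A metric score is simply the share of records that pass its check.
    static double percentPassing(List<Customer> rows, Predicate<Customer> check) {
        long passing = rows.stream().filter(check).count();
        return 100.0 * passing / rows.size();
    }

    public static void main(String[] args) {
        List<Customer> rows = List.of(
            new Customer("a@example.com", "US"),
            new Customer(null, "DE"),              // incomplete record
            new Customer("not-an-email", "FR"));   // invalid record

        // Completeness: mandatory fields are present
        double completeness = percentPassing(rows,
            c -> c.email() != null && c.country() != null);
        // Validity: values conform to the expected format
        double validity = percentPassing(rows,
            c -> c.email() != null && c.email().matches("[^@\\s]+@[^@\\s]+\\.[^@\\s]+"));

        // Prints: completeness: 67%, validity: 33%
        System.out.printf("completeness: %.0f%%, validity: %.0f%%%n",
                          completeness, validity);
    }
}
```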

  29. Result Summary General quality of data: GOOD • Accuracy 97% • Completeness 98% • Consistency 96% • Timeliness 90% • Validity 97% • Accessibility 95% • Currency 91%. Live (production) data quality: 93% (GOOD). Data structure: 91% (GOOD).

  30. OPEN SOURCE CODE ASSESSMENT

  31. Open Source Code Assessment Detects inclusions of open source components in the product and helps mitigate the copyright, security, legal, and operational risks that can arise from such use. Report outcomes: • List of open source code in the product • Licenses and copyrights • Open source software version analysis • Vulnerabilities and risks
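One common detection technique is file fingerprinting; the sketch below illustrates the idea (this is an assumption about the approach, since the deck does not name the detection method): hash each file and look the digest up in an index of known open-source files.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.util.HexFormat;
import java.util.List;
import java.util.Map;

public class OssScanner {

    // Hypothetical index: file digest -> "component (license)". Real scanners
    // query large fingerprint databases instead of an in-memory map.
    static final Map<String, String> KNOWN = Map.of(
            "2fd4e1c67a2d28fced849ee1bb76e7391b93eb12", "libfoo 1.2 (MIT)");

    static String sha1(Path file) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-1");
        return HexFormat.of().formatHex(md.digest(Files.readAllBytes(file)));
    }

    public static void main(String[] args) throws Exception {
        Path root = Path.of(args.length > 0 ? args[0] : ".");
        List<Path> files;
        try (var paths = Files.walk(root)) {
            files = paths.filter(Files::isRegularFile).toList();
        }
        for (Path p : files) {
            String match = KNOWN.get(sha1(p));     // digest lookup
            if (match != null) {
                System.out.println(p + " -> " + match);
            }
        }
    }
}
```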

  32. Tools Used in the Assessment Process

  33. Knowledge Base • Applications Architecture (Oracle) – Integrating and Automating Next Generation Business Processes • MSDN Guidelines – the key guidelines for designing the business layer of an application • W3C – the main international standards organization for the World Wide Web • OWASP – the Open Web Application Security Project • ISO/IEC/IEEE 42010 – Systems and software engineering – Recommended practice for architectural description of software-intensive systems • ISO/IEC 25010:2011 – Systems and software engineering – Systems and software Quality Requirements and Evaluation (SQuaRE) – System and software quality models • ISO/IEC 25012:2008 – Software engineering – Software product Quality Requirements and Evaluation (SQuaRE) – Data quality model • ISO 9241-11:1998 – Ergonomic requirements for office work with visual display terminals (VDTs) – Part 11: Guidance on usability • ISO 7498-2:1989 – Information processing systems – Open Systems Interconnection – Basic Reference Model – Part 2: Security Architecture • ISO/IEC 14756:1999 – Information technology – Measurement and rating of performance of computer-based software systems • ISO 8000-130:2016 – Data quality – Part 130: Master data: Exchange of characteristic data: Accuracy • ISO 8000-140:2016 – Data quality – Part 140: Master data: Exchange of characteristic data: Completeness • ISO/IEC 27001 – Information security management • ISO/IEC 26514:2008 – Systems and software engineering – Requirements for designers and developers of user documentation • ISO/IEC 9126-4:2001 – Software engineering – Product quality – Part 4: Quality in use metrics • ISO/IEC 25062:2006 – Software engineering – Software product Quality Requirements and Evaluation (SQuaRE) – Common Industry Format (CIF) for usability test reports • ISO/IEC 25064:2013 – Systems and software engineering – Software product Quality Requirements and Evaluation (SQuaRE) – Common Industry Format (CIF) for usability: User needs report

  34. TETRA works perfectly for everyone • For users: Is it good? Can we use it safely? • For clients: Does our product work as expected? Should we rework it? • For investors: Is the product beneficial? Should we invest? • For managers: Give me the numbers! Are we going up or down? • For developers: Is my code "good"? How can I improve it?

  35. Why use TETRA to assess software products? • Source code quality analysis is automated • Usability evaluated • Security empowered • Overall performance estimated • Business logic evaluated • Solution architecture assessed • Resistance to bad data enabled • Open-source software detected

  36. Main Benefits of TETRA • Technical debts paid before they turn into pains • Development and support costs reduced • Progress evaluation adjusted • Business efficiency predicted • Comprehensive quality analysis performed • Detailed analysis of key product features held • Compliance check conducted • Improvement recommendations created

  37. Thank You! Intetics Inc. 10001 Tamiami Trl N, Suite 114 Naples, Florida 34108 United States www.intetics.com odt@intetics.com Office: +1-239-217-4907
