Software Productivity, Quality and Usability: Measurement, Prediction and Improvements

by Dick B. Simmons, Way Kuo, and Hiroko Fujihara
    Overview

    Visualize software project success with PAMPA!


    PAMPA is the Project Attribute Monitoring and Prediction Associate, a powerful online tool for gathering data and for measuring, predicting, and tracking the objects, attributes, and relationships at the heart of software development.


    With PAMPA you can:



    • Increase customer satisfaction
    • Reduce defects
    • Improve productivity
    • Decrease costs


    The key is visualization. PAMPA gives form to the concepts and metrics that many developers have considered not merely invisible, but unvisualizable. With the techniques explained in Software Measurement, you can use PAMPA to take control of the parallel cycles of project control and process improvement.


    Project attributes can be displayed as:



    • Graphical trees
    • Tables
    • Radar charts
    • 2-D graphs
    • 3-D graphs


    Software Measurement begins with an overview of software process visualization. Quality systems criteria and standards are applied to the concepts of life cycle processes and project object classes.


    With this foundation, the discussion goes on to specific models and metrics, including:



    • Scale
    • Development Time
    • Productivity
    • Quality
    • Reliability
    • Usability


    Verification, validation, and testing techniques complete the picture. Appendices include a complete Users Manual for PAMPA and a guide to its object classes.


    Software Measurement will be a valuable asset for software developers, team leaders, and project managers, as well as students of software engineering, and anyone involved in software metrics and process improvement.


    The accompanying CD-ROM, for use on Windows NT 3.5 or later and Windows 95, contains everything you need to put PAMPA to work on your next software project.


    ISBN: 0-13-840695-2

    Editorial Reviews

    Booknews
    The CD-ROM contains PAMPA (Project Attribute Monitoring and Prediction Associate), a tool for gathering data and measuring, predicting, and tracking the objects, attributes, and relationships at the heart of software development. Following an overview of software process visualization, the text applies quality systems criteria and standards to the concepts of life cycle processes and project object classes. The discussion goes on to specific models and metrics, and then verification, validation, and testing techniques. Annotation c. by Book News, Inc., Portland, Or.

    Product Details

    ISBN-13: 9780138406950
    Publisher: Prentice Hall Professional Technical Reference
    Publication date: 01/28/1997
    Edition description: BK&CD-ROM
    Pages: 384
    Product dimensions: 7.28(w) x 9.55(h) x 1.53(d)

    Read an Excerpt

    Preface

    "Software, like wind, is invisible yet powerful. We can 'see' wind only by observing its effect, as it sways branches and swirls sand. And we can 'see' software only by observing its functionality. Invisibility is what makes software development so difficult to manage-it is hard to monitor the construction of something you can't see." Pei Hsia Hsia96.

    Indeed, software is difficult to visualize. Managers typically rely on staff to describe the status of a software product, and software development projects often fail for lack of knowledge about the software being developed. Fred Brooks, whose famous article argued that there is no silver bullet that by itself promises even one order-of-magnitude improvement in productivity, reliability, or simplicity, maintains that invisibility is an inherent property of software and that software is invisible and unvisualizable [Brooks87]. In our opinion, Brooks is overly pessimistic.

    The purpose of this book is to show the reader how to visualize software that experts like Brooks have called unvisualizable. We show not only that software attributes can be visualized, but that they can be measured unobtrusively and that the measurements can drive volume, complexity, rework, efficiency, effort, productivity, schedule, reliability, reuse, speedup, and usability prediction models.

    Over the past 35 years, administrators have spent large sums of money producing documents to describe software products. Large government projects may require contractors to produce from 30 to 50 separate documents for each software product. Yet with all these documents available, people may still not know the status of a project. Many companies still use document-driven development to produce software that is over budget and behind schedule, and then deliver it to a dissatisfied customer.

    In 1995, Capers Jones stated that the failure or cancellation rate of large software systems is over 20% [Jones95]. Of the 80% that are completed, approximately two thirds are late and experience cost overruns of as much as 100%. Roughly two thirds are also plagued by low reliability and quality problems in the first year of development.

    During the 1990s, forward-looking companies turned away from documentation-driven development and toward metric-driven process improvement. While some documentation is necessary, software measurement can replace documents that do not really help you visualize software.

    In Chapter 1 we cite examples of successful software development projects from AT&T Bell Laboratories, AT&T Network Systems, Boeing, Bull, CSC, DEC, IBM Santa Teresa Laboratory, Microsoft, NASA, Raytheon, Toshiba, and University of Maryland.
    Process improvement activities at these organizations have resulted in annual defect reduction of up to 32%, cost reduction of up to 8.4%, productivity gains of up to 32% and customer satisfaction gains of up to 7.8%.
    All of these successful projects apply some form of software measurement as part of their process improvement activities. But many other major organizations still have no process improvement activities, and many of these have ended up with failed projects. Often the managers of these software projects were not aware of any problems until it was too late to salvage the project or to prevent cost overruns, extended outages, or major delays in scheduled delivery. Management was not able to see that problems were developing.

    In 1996, Norm Brown, Executive Director of the Department of Defense Software Acquisition Best Practices Initiative, stated, "In many cases, the true nature and pervasive extent of underlying project problems remains invisible to the project management until it is too late" [Brown96]. Metric-based software project management helps you dynamically track software product development, so you can apply management control early in the project cycle, when adjustments can still be effective.

    This book has been written for practicing software developers, team leaders, project managers, top administrators and others interested in software measurement, project control, and process improvement. Also, it can serve as a reference text for advanced undergraduate or graduate classes in software engineering and as the main text for a course in software metrics, models, and process improvement.

    The book is divided into three parts: Software Process Visualization, Models and Metrics, and Visualization Tool.
    Part 1 on software process visualization contains five chapters.
    Chapter 1 shows how software has become the high cost item for most computer projects. We explain how management has difficulty visualizing software during development and maintenance.

    Examples are presented where industry has learned to improve the software development process by applying software improvement methodology, and major failures are described where organizations were unable to visualize the software development process. Project technologies deployed in successful software projects and the Department of Defense PRINCIPAL BEST PRACTICES are introduced to show how they incorporate planning based on accurate measurements to improve project visibility.

    Chapter 2 examines the criteria and standards for quality systems.

    Chapter 3 introduces a set of project object classes for describing an arbitrary software development project. A view of the project world is introduced that uses objects, relationships, attributes, and properties. The objects are defined to be easily understandable by both managers and software developers. Once the object world is presented, the project personnel can use it to view the effect of changes in resources, schedules, or software product features from historical, status, compliance to plans, and prediction perspectives. We also introduce a dual project control/process improvement cycle where common visualization stages help you efficiently and objectively gather object attributes for analysis and prediction.

    In Chapter 4, we partition the evolution of the software life cycle (SLC) into three time periods: Early SLC, Black Box SLC, and Process SLC. We start with simple SLCs for small projects and proceed to complex SLCs for large software products. For the Black Box SLC period, we examine waterfall, V, prototype, and incremental SLCs. For the Process SLC period we examine spiral and the natural milestone SLCs. We also describe the IEEE SLC Process Model, which contains development, management, and integral processes.

    In Chapter 5 we describe the object classes that can be used to portray a software project. You can use the objects to see exactly what is happening in a software development project. All activities of SLC processes can be made visible to you. After we describe the object classes, we give a few examples of the object attributes. Detailed descriptions of all software project object classes, attributes, relationships, and properties are presented in Appendix A.

    Part 2, on models and metrics, contains nine chapters. Chapter 6 describes size metrics, including volume, structure, rework, and reuse. Models that use the amounts of new and recycled code to predict the equivalent amount of new code are included.
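    The equivalent-size idea can be sketched with the classic COCOMO adaptation adjustment, a well-known model of the kind Chapter 6 surveys; the 0.4/0.3/0.3 weights below are the standard textbook values, not figures from this book:

```python
def equivalent_new_kloc(new_kloc, reused_kloc, dm=0.0, cm=0.0, im=0.0):
    """Equivalent amount of new code from new plus recycled code.

    dm, cm, im are the fractions (0..1) of the reused code's design,
    code, and integration work that must be redone; the 0.4/0.3/0.3
    weights are the classic COCOMO adaptation adjustment factor.
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im
    return new_kloc + reused_kloc * aaf

# 10 KLOC new plus 20 KLOC reused, with half of the design, code, and
# integration reworked, counts as 20 KLOC of equivalent new code.
size = equivalent_new_kloc(10, 20, dm=0.5, cm=0.5, im=0.5)
```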

    Chapter 7 describes effort prediction models that range from models based on a single cost driver to complex composite models based on many cost drivers. Cost dominators, project attributes that can have a 10:1 effect on project cost, are introduced. These dominators often cause projects to fail.
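    The flavor of such composite models can be sketched with an intermediate-COCOMO-style formula; the coefficients are the textbook values for an "organic" project, and the driver multipliers are illustrative, not taken from this book:

```python
from math import prod

def effort_person_months(kloc, cost_drivers=(), a=3.2, b=1.05):
    """Effort = a * KLOC^b * EAF, where the effort adjustment factor
    (EAF) is the product of the cost-driver multipliers."""
    eaf = prod(cost_drivers) if cost_drivers else 1.0
    return a * kloc ** b * eaf

nominal = effort_person_months(10)                 # all drivers nominal (1.0)
adjusted = effort_person_months(10, (1.15, 0.88))  # e.g. high reliability, strong analysts
```

    In this sketch a cost dominator corresponds to a single multiplier near 10, which is why one bad project attribute can swamp every other driver.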

    Chapter 8 explains how you can estimate development time. Development schedule compression is examined from a team viewpoint and then from the overall project perspective.
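    As a rough illustration of nominal schedule estimation, here is a COCOMO-style textbook relation (organic-mode coefficients; not this book's model):

```python
def nominal_schedule_months(effort_pm, c=2.5, d=0.38):
    """Nominal development time TDEV = c * Effort^d, with effort in
    person-months; c=2.5, d=0.38 are textbook organic-mode values."""
    return c * effort_pm ** d
```

    Because the exponent is well below 1, schedule grows much more slowly than effort, and compressing it below the nominal value is costly; that is why compression is examined separately from the team and project perspectives.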

    Chapter 9 shows how to predict productivity based on effort and volume models. Eighteen cost drivers are examined to see their effect on productivity. Efficiency and speedup prediction models are introduced to show how communications, work breakdown structure, and the number of team members affect productivity.
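    The communication effect can be sketched in the spirit of Brooks's pairwise-path argument; the per-path overhead fraction below is illustrative, not a value from the book:

```python
def communication_paths(n):
    """Number of pairwise communication links in an n-person team."""
    return n * (n - 1) // 2

def effective_speedup(n, overhead_per_path=0.02):
    """Ideal speedup n, degraded by a fixed overhead per communication path."""
    return n / (1 + overhead_per_path * communication_paths(n))
```

    With 2% overhead per path, an 8-person team yields only about a 5x speedup, and beyond some size adding members reduces throughput, which is where work breakdown structure comes in.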

    In Chapter 10, we examine the many factors that contribute to overall software product quality. We then show that software reliability and usability are the main contributors to overall software product quality.

    Chapter 11 presents a review of some well-known reliability models, both stochastic and nonstochastic (static) models, to pave the way for the future development and evaluation of highly reliable software and of systems involving software and hardware. Important issues like software life-cycle costs, software safety, and the necessity of documents are also addressed.
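    One widely cited stochastic model of the kind Chapter 11 reviews is the Goel-Okumoto nonhomogeneous Poisson process; a minimal sketch with illustrative parameter values:

```python
from math import exp

def expected_failures(t, a=100.0, b=0.05):
    """Goel-Okumoto mean value function mu(t) = a * (1 - exp(-b*t)):
    expected cumulative failures by test time t, where a is the expected
    total failure count and b the per-fault detection rate (illustrative)."""
    return a * (1.0 - exp(-b * t))

def failure_intensity(t, a=100.0, b=0.05):
    """lambda(t) = mu'(t) = a * b * exp(-b*t), the current failure rate."""
    return a * b * exp(-b * t)
```

    Fitting a and b to observed failure times lets you estimate residual faults, a - mu(t), and judge when reliability is high enough to release.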

    Chapter 12 explains how to validate that the correct features have been built into a software product and to verify that the design is properly implemented. In this chapter, we cover the software test process, test categorization, test management systems, test tools, defect management, and test process measurement.

    Chapter 13, on usability, addresses the frustration users feel when they have difficulty operating software. We first define usability, explain how to design for usability, present usability models, and summarize recent usability studies.

    Chapter 14 shows how to test for usability. We explain the purpose of testing; we then describe usability test variables, scenarios, and procedures.

    Part 3, on the visualization tool, contains a single chapter describing the Project Attribute Monitoring and Prediction Associate (PAMPA) tool. The PAMPA tool is being developed to unobtrusively gather metrics from any software development project. The information is preserved as objects with attributes, relationships, and properties. The objects are recognizable and understandable by a typical software developer or manager. PAMPA analyzes the information and presents it in a variety of formats, including management reports, 2-dimensional graphs, 3-dimensional graphs, radar graphs, histograms, and Pareto diagrams. Intelligent agents can be developed to monitor projects and alert management to anomalies. An initial version of the PAMPA tool is included on a compact disc (CD) in the cover of this book. With it, after you transfer the files to your workstation, you can gather project information from any software development project and save it in an understandable object/attribute format.

    You can then view the projects using an inexpensive workstation that runs Microsoft Windows 95 or NT and Office 97. Included is an Object Editor for viewing detailed attributes and relationships of project objects, and a Metric Plot Generator that lets you view 2-dimensional graphs and bar charts, 3-dimensional charts, and radar charts. Sorting features are available that allow you to create a Pareto chart from bar charts. This simple version of PAMPA gathers information on software products written in the C or C++ programming languages. Later versions will gather information for any language developed in an arbitrary development environment. They will also gather information from defect management systems, feature tracking systems, planning systems, suppliers, and customers. A PAMPA Users Manual is included in Appendix B. The manual contains information for installing and configuring PAMPA, a PAMPA tutorial, an Object Editor reference section, and a Metric Plot Generator section.

    The authors gratefully acknowledge the contributions of numerous individuals whose assistance and support were invaluable to the development of this book and the software that accompanies it. Jamileth Holtfrerich made an indispensable contribution by coordinating tasks and organizing all draft versions of the book. Hewlett Packard (HP) provided financial support to the Software Process Improvement Laboratory at Texas A&M University, where the PAMPA tool was developed. Art Lane, the HP representative to the Computer Science Department Development Council Committee, was a key individual in helping to establish a working relationship between HP and the Software Process Improvement Laboratory at Texas A&M University.

    We would like to thank the following managers at HP for their support: Von Hansen, Bob Deely, Don Wadley, Gary Johnston, Mark Brown, Ming-Zen Kuo, and Tommy Mouser.

    There were many students who directly and indirectly worked on the many projects within the Software Process Improvement Laboratory. While we cannot recognize every student, we would like to thank the following students for their contribution to development of the PAMPA tool: David Aldridge, John Burton, Chris Chapman, Travis Chow, Clayton Daigle, Mark Fleming, Mario Garcia, Gunawan, Mark Hashimoto, Jason Jaynes, Doug Keegan, Gunadi Lauw, Jeremy Mayhew, Steve Mazzucco, Ryan Moran, Anh Nguyen, David Quick, Balaji Rathakrishnan, Saravjit Rihal, Michael Schmidlkofer, Jeffery Sharp, Linda Thai, Jason Thompson, Glen Weinecke, and Matthew Wilson.

    REFERENCES
    [Hsia96] Hsia, P., "Making Software Development Visible," IEEE Software, March 1996, pp. 23-25.
    [Jones95] Jones, C., "Patterns of Large Software Systems: Failure and Success," IEEE Computer, March 1995, pp. 86-87.
    [Brooks87] Brooks, F. P., Jr., "No Silver Bullet: Essence and Accidents of Software Engineering," IEEE Computer, April 1987, pp. 10-19.
    [Brown96] Brown, N., "Industrial-Strength Management Strategies," IEEE Software, July 1996, pp. 94-103.
