Software Productivity, Quality and Usability: Measurement, Prediction and Improvements

Overview

PAMPA is Project Attribute Monitoring and Prediction Associate, a powerful on-line tool for gathering data and measuring, predicting, and tracking the objects, attributes, and relationships at the heart of software development. With PAMPA you can increase customer satisfaction, improve productivity, reduce defects, and decrease costs. The key is visualization. PAMPA gives form to the concepts and metrics that many developers have considered not merely invisible, but unvisualizable. Using the techniques explained in Software Measurement, you can use PAMPA to gain control over the parallel cycles of project control and process improvement. Software Measurement will be a valuable asset for software developers, team leaders, and project managers, as well as students of software engineering, and anyone involved in software metrics and process improvement.

This is a software development publication focused on metric-driven process improvement and data visualization using the Project Attribute Monitoring and Prediction Associate (PAMPA), a data-gathering and visualization tool. The tool does not depend on any specific type of software project; instead it uses object classes for "...describing an arbitrary software development project." The authors discuss concepts and issues in software process visualization, quality standards and systems, and project failures and successes attributable to data visualization. They then define and describe the object classes and discuss the software life cycle. The publication emphasizes metrics and models, with specific descriptions of metric components, code rework, and reuse. The models contain an interesting predictive feature: they use the amounts of new and recycled code to predict the equivalent amount of new code for an arbitrary software project. The models themselves range from single-cost-driver models to complex composite ones, and cover project time estimation and development schedule compression. Effort and volume models also allow you to predict productivity. The models are further expanded to include efficiency, work breakdown structures, communications, and number of team members to measure their effects on productivity. The authors also examine well-known stochastic and non-stochastic reliability models, then discuss reliability, software validation, and usability issues.


Editorial Reviews

Booknews
The CD-ROM contains PAMPA (Project Attribute Monitoring and Prediction Associate), a tool for gathering data and measuring, predicting, and tracking the objects, attributes, and relationships at the heart of software development. Following an overview of software process visualization, the text applies quality systems criteria and standards to the concepts of life cycle processes and project object classes. The discussion goes on to specific models and metrics, and then verification, validation, and testing techniques. Annotation c. by Book News, Inc., Portland, Or.

Product Details

  • ISBN-13: 9780138406950
  • Publisher: Prentice Hall Professional Technical Reference
  • Publication date: 1/28/1997
  • Edition description: BK&CD-ROM
  • Edition number: 1
  • Pages: 384
  • Product dimensions: 7.28 (w) x 9.55 (h) x 1.53 (d)

Read an Excerpt

Preface

"Software, like wind, is invisible yet powerful. We can 'see' wind only by observing its effect, as it sways branches and swirls sand. And we can 'see' software only by observing its functionality. Invisibility is what makes software development so difficult to manage; it is hard to monitor the construction of something you can't see." (Pei Hsia [Hsia96])

Indeed, software is difficult to visualize. Managers typically rely on staff to describe the status of a software product, and software development projects often fail for lack of knowledge about the software being developed. Experts agree that software is hard to visualize. Fred Brooks, who wrote the article arguing that there is no silver bullet that by itself promises even one order-of-magnitude improvement in productivity, reliability, or simplicity, says that invisibility is an inherent property of software and that software is invisible and unvisualizable [Brooks87]. In our opinion, Brooks is overly pessimistic.

The purpose of this book is to show the reader how to visualize software that experts like Brooks have called unvisualizable. We show not only that software attributes can be visualized, but also that they can be measured unobtrusively and that the measurements can drive volume, complexity, rework, efficiency, effort, productivity, schedule, reliability, reuse, speedup, and usability prediction models.

Over the past 35 years, administrators have spent large sums of money producing documents to describe software products. Large government projects may require contractors to produce from 30 to 50 separate documents for each software product. Yet even with all these documents available, people may still not know the status of a project. Many companies still use document-driven development, produce software that is over budget and behind schedule, and then deliver it to dissatisfied customers.

In 1995, Capers Jones stated that the failure or cancellation rate of large software systems is over 20% [Jones95]. Of the 80% that are completed, approximately two thirds are late and experience cost overruns of as much as 100%. Roughly two thirds are also plagued by low reliability and quality problems in the first year of development.

During the 1990s, forward-looking companies turned away from documentation-driven development and toward metric-driven process improvement. While some documentation is necessary, software measurement can replace documents that do not really help you visualize software.

In Chapter 1 we cite examples of successful software development projects from AT&T Bell Laboratories, AT&T Network Systems, Boeing, Bull, CSC, DEC, IBM Santa Teresa Laboratory, Microsoft, NASA, Raytheon, Toshiba, and University of Maryland.
Process improvement activities at these organizations have resulted in annual defect reduction of up to 32%, cost reduction of up to 8.4%, productivity gains of up to 32% and customer satisfaction gains of up to 7.8%.
All of these successful projects apply some form of software measurement as part of their process improvement activities. Many other major organizations, however, still have no process improvement activities, and many of these have ended up with failed projects. Often the management of these software projects was not aware of any problems until it was too late to salvage the project or to prevent cost overruns, extended outages, or major delays in scheduled delivery. Management was not able to see that problems were developing.

In 1996, Norm Brown, Executive Director of the Department of Defense Software Acquisition Best Practices Initiative, stated, "In many cases, the true nature and pervasive extent of underlying project problems remains invisible to the project management until it is too late" [Brown96]. Metric-based software project management helps you dynamically track software product development so you can apply management control early in the project cycle, where adjustments can be effective.

This book has been written for practicing software developers, team leaders, project managers, top administrators and others interested in software measurement, project control, and process improvement. Also, it can serve as a reference text for advanced undergraduate or graduate classes in software engineering and as the main text for a course in software metrics, models, and process improvement.

The book is divided into three parts: Software Process Visualization, Models and Metrics, and Visualization Tool.
Part 1 on software process visualization contains five chapters.
Chapter 1 shows how software has become the high cost item for most computer projects. We explain how management has difficulty visualizing software during development and maintenance.

Examples are presented in which industry has learned to improve the software development process by applying software improvement methodology. Major failures are also described in which organizations were unable to visualize the software development process. Project technologies deployed in successful software projects and the Department of Defense PRINCIPAL BEST PRACTICES are introduced to show how both rest on planning based on accurate measurements to improve project visibility.

Chapter 2 examines the criteria and standards for quality systems.

Chapter 3 introduces a set of project object classes for describing an arbitrary software development project. A view of the project world is introduced that uses objects, relationships, attributes, and properties. The objects are defined to be easily understandable by both managers and software developers. Once the object world is presented, the project personnel can use it to view the effect of changes in resources, schedules, or software product features from historical, status, compliance to plans, and prediction perspectives. We also introduce a dual project control/process improvement cycle where common visualization stages help you efficiently and objectively gather object attributes for analysis and prediction.

In Chapter 4, we partition the evolution of the software life cycle (SLC) into three time periods: Early SLC, Black Box SLC, and Process SLC. We start with simple SLCs for small projects and proceed to complex SLCs for large software products. For the Black Box SLC period, we examine waterfall, V, prototype, and incremental SLCs. For the Process SLC period we examine spiral and the natural milestone SLCs. We also describe the IEEE SLC Process Model, which contains development, management, and integral processes.

In Chapter 5 we describe the object classes that can be used to portray a software project. You can use the objects to see exactly what is happening in a software development project. All activities of SLC processes can be made visible to you. After we describe the object classes, we give a few examples of the object attributes. Detailed descriptions of all software project object classes, attributes, relationships, and properties are presented in Appendix A.

Part 2 on models and metrics, contains nine chapters. Chapter 6 describes size metrics, including volume, structure, rework, and reuse. Models that use amount of new code and recycled code to predict equivalent amount of new code are included.
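
A common way to express such a model is to weight recycled code by the fraction of new-development effort its adaptation consumes. The following sketch illustrates the idea; the adaptation factor is an illustrative value, not the book's calibrated coefficient.

```python
def equivalent_new_kloc(new_kloc, reused_kloc, adaptation_factor=0.3):
    """Combine new and recycled code into an equivalent-new-code size.

    adaptation_factor is the fraction of new-development effort needed
    to understand and adapt each reused KLOC (illustrative value only).
    """
    if not 0.0 <= adaptation_factor <= 1.0:
        raise ValueError("adaptation_factor must be in [0, 1]")
    return new_kloc + adaptation_factor * reused_kloc

# 40 KLOC written from scratch plus 100 KLOC recycled at a 30%
# adaptation cost counts as 70 KLOC of equivalent new code.
```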

Chapter 7 describes effort prediction models that range from models based on a single cost driver to complex composite models based on many cost drivers. Cost dominators, project attributes that can have a 10:1 effect on project cost, are introduced. These dominators often cause projects to fail.
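
A single-cost-driver model typically takes the power-law form familiar from the COCOMO family: effort = a * size^b. The sketch below is illustrative; the coefficients are placeholder values, and real models calibrate them from historical project data.

```python
def effort_person_months(size_kloc, a=3.0, b=1.12):
    """Single-cost-driver effort model: effort = a * size^b.

    With b > 1 the model exhibits a diseconomy of scale: doubling
    size more than doubles effort. Coefficients are illustrative.
    """
    if size_kloc < 0:
        raise ValueError("size must be non-negative")
    return a * size_kloc ** b
```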

Chapter 8 explains how you can estimate development time. Development schedule compression is examined from a team viewpoint and then from the overall project perspective.
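
Schedule models of this kind usually derive nominal development time from predicted effort with another power law, T = c * effort^d, and then treat compression below that nominal schedule as increasingly expensive. The coefficients below are illustrative placeholders, not the book's values.

```python
def development_time_months(effort_pm, c=2.5, d=0.38):
    """Nominal development schedule: T = c * effort^d (illustrative
    COCOMO-style coefficients). d < 1 means schedule grows much more
    slowly than effort."""
    return c * effort_pm ** d

def average_staff(effort_pm):
    """Average team size implied by the effort and nominal schedule."""
    return effort_pm / development_time_months(effort_pm)
```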

Chapter 9 shows how to predict productivity based on effort and volume models. Eighteen cost drivers are examined to see their effect on productivity. Efficiency and speedup prediction models are introduced to show how communications, work breakdown structure, and number of team members affect productivity.
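
The communications effect can be sketched from the observation that a team of n people has n(n-1)/2 pairwise channels. In the hedged model below, each channel consumes a fraction of every member's productive time; the overhead value is an illustrative assumption, not the book's calibrated figure.

```python
def communication_paths(team_size):
    """Pairwise communication channels in a team of n people."""
    return team_size * (team_size - 1) // 2

def speedup(team_size, comm_overhead=0.02):
    """Illustrative speedup model: each additional teammate costs every
    member a fixed fraction of productive time, so beyond some team
    size adding people reduces total output."""
    productive_fraction = max(0.0, 1.0 - comm_overhead * (team_size - 1))
    return team_size * productive_fraction
```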

In Chapter 10, we examine the many factors that contribute to overall software product quality. We then show that software reliability and usability are the main contributors to overall software product quality.

Chapter 11 presents a review of some well-known reliability models, both stochastic and nonstochastic (static) models, to pave the way for the future development and evaluation of highly reliable software and of systems involving software and hardware. Important issues like software life-cycle costs, software safety, and the necessity of documents are also addressed.
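
As one concrete instance of a stochastic reliability model, the Goel-Okumoto NHPP model describes the expected cumulative failures as mu(t) = a(1 - e^(-bt)). The sketch below uses illustrative parameter values; this particular model is offered as a representative example, not necessarily one the chapter singles out.

```python
import math

def expected_failures(t, a=100.0, b=0.05):
    """Goel-Okumoto mean value function mu(t) = a(1 - e^{-bt}).
    a = expected total defects, b = per-defect detection rate
    (both illustrative)."""
    return a * (1.0 - math.exp(-b * t))

def reliability(t, x, a=100.0, b=0.05):
    """Probability of no failure in (t, t+x] under the NHPP model:
    R(x|t) = exp(-(mu(t+x) - mu(t)))."""
    return math.exp(-(expected_failures(t + x, a, b)
                      - expected_failures(t, a, b)))
```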

Chapter 12 explains how to validate that the correct features have been built into a software product and to verify that the design is properly implemented. In this chapter, we cover the software test process, test categorization, test management systems, test tools, defect management, and test process measurement.

Chapter 13, on usability, addresses the frustration users experience when they have difficulty operating software. We first define usability, explain how to design for it, present usability models, and summarize recent usability studies.

Chapter 14 shows how to test for usability. We explain the purpose of testing; we then describe usability test variables, scenarios, and procedures.

Part 3, on visualization tool, contains a single chapter describing the Project Attribute Monitoring and Predicting Associate (PAMPA) tool. The PAMPA tool is being developed to unobtrusively gather metrics from any software development project. The information is preserved as objects with attributes, relationships, and properties. The objects are recognizable and understandable by a typical software developer or manager. PAMPA analyzes the information and presents it in a variety of formats including management reports, 2-dimensional graphs, 3-dimensional graphs, radar graphs, histograms, and Pareto diagrams. Intelligent agents can be developed to monitor projects and alert management of anomalies. An initial version of the PAMPA tool is included on a compact disk (CD) in the cover of this book. With it, after you transfer your files to your workstation, you can gather project information from any software development project and then save it in an understandable object/attribute format.

You can then view the projects using an inexpensive workstation that runs Microsoft Windows 95 or NT and Office 97. Included is an Object Editor to view detailed attributes and relationships of project objects. There is also a Metric Plot Generator that lets you view 2-dimensional graphs and bar charts, 3-dimensional charts, and radar charts. Sorting features allow you to create a Pareto chart from bar charts. This simple version of PAMPA gathers information on software products written in the C or C++ programming languages. Later versions will gather information for any language developed in an arbitrary development environment, and will also gather information from defect management systems, feature tracking systems, planning systems, suppliers, and customers. A PAMPA Users Manual is included in Appendix B. The manual contains information for installing and configuring PAMPA, a PAMPA tutorial, an Object Editor reference section, and a Metric Plot Generator section.
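
The Pareto-chart step described above amounts to sorting category counts in descending order and attaching cumulative percentages. The defect categories and counts below are hypothetical, chosen only to illustrate the transformation.

```python
def pareto(counts):
    """Sort category counts descending and attach cumulative
    percentages: the transformation behind a Pareto chart."""
    total = sum(counts.values())
    ordered = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    rows, running = [], 0
    for category, count in ordered:
        running += count
        rows.append((category, count, round(100.0 * running / total, 1)))
    return rows

# Hypothetical defect counts by category
defects = {"logic": 40, "interface": 25, "documentation": 5, "data": 30}
```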

The authors gratefully acknowledge the contributions of numerous individuals whose assistance and support were invaluable to the development of this book and the software included with it. Jamileth Holtfrerich made an indispensable contribution by coordinating tasks and organizing all draft versions of the book. Hewlett Packard (HP) provided financial support to the Software Process Improvement Laboratory at Texas A&M University, where the PAMPA tool was developed. Art Lane, the HP representative to the Computer Science Department Development Council Committee, was a key individual in helping to establish a working relationship between HP and the Software Process Improvement Laboratory at Texas A&M University.

We would like to thank the following managers at HP for their support: Von Hansen, Bob Deely, Don Wadley, Gary Johnston, Mark Brown, Ming-Zen Kuo, and Tommy Mouser.

There were many students who directly and indirectly worked on the many projects within the Software Process Improvement Laboratory. While we cannot recognize every student, we would like to thank the following students for their contribution to development of the PAMPA tool: David Aldridge, John Burton, Chris Chapman, Travis Chow, Clayton Daigle, Mark Fleming, Mario Garcia, Gunawan, Mark Hashimoto, Jason Jaynes, Doug Keegan, Gunadi Lauw, Jeremy Mayhew, Steve Mazzucco, Ryan Moran, Anh Nguyen, David Quick, Balaji Rathakrishnan, Saravjit Rihal, Michael Schmidlkofer, Jeffery Sharp, Linda Thai, Jason Thompson, Glen Weinecke, and Matthew Wilson.

REFERENCES
[Hsia96] Hsia, P., "Making Software Development Visible," IEEE Software, March 1996, pp. 23-25.
[Jones95] Jones, C., "Patterns of Large Software Systems: Failure and Success," IEEE Computer, March 1995, pp. 86-87.
[Brooks87] Brooks, F. P., Jr., "No Silver Bullet: Essence and Accidents of Software Engineering," IEEE Computer, April 1987, pp. 10-19.
[Brown96] Brown, N., "Industrial-Strength Management Strategies," IEEE Software, July 1996, pp. 94-103.


Table of Contents

Preface
Pt. 1 Software Process Visualization 1
1 Introduction 3
2 Quality Systems Criteria and Standards 21
3 Project Visualization 47
4 Life Cycle Processes 65
5 Project Object Classes 105
Pt. 2 Models and Metrics 121
6 Size 123
7 Effort 163
8 Development Time 199
9 Productivity 213
10 Quality 250
11 Reliability 266
12 Verification and Validation Testing 289
13 Usability 304
14 Usability Testing 322
Pt. 3 Visualization Tool 343
15 Project Attribute Monitoring and Prediction Associate (PAMPA) 345
App. A PAMPA Object Classes 359
App. B PAMPA Users Manual 383
Read More Show Less

Preface

Preface: Preface

"Software, like wind, is invisible yet powerful. We can 'see' wind only by observing its effect, as it sways branches and swirls sand. And we can 'see' software only by observing its functionality. Invisibility is what makes software development so difficult to manage-it is hard to monitor the construction of something you can't see." Pei Hsia Hsia96.

Indeed, software is difficult to visualize. Managers typically rely on staff to describe the status of a software product. Software development projects often fail for lack of knowledge about the software being developed. Experts agree that software is hard to visualize. Fred Brooks, who wrote the article stating there is no silver bullet that by itself promises even one order-of-magnitude improvement in productivity, in reliability, in simplicity, says that invisibility is an inherent software property and that software is invisible and unvisualizable Brooks87. In our opinion, Brooks is overly pessimistic.

The purpose of this book is to show the reader how to visualize software which experts like Brooks have said is unvisualizable. We not only show that software attributes can be visualized but that they can be unobtrusively measured and the measurements can be used to drive volume, complexity, rework, efficiency, effort, productivity, schedule, reliability, reuse, speedup, and usability prediction models.

Over the past 35 years, administrators have spent large sums of money producing documents to describe a software product. Large government projects may require contractors to produce from 30 to 50 separate documents for each software product. With all these documentsavailable,it is clear that people may still not know the status of a project. Many companies still use document-driven development to produce software that is over budget, behind schedule and then deliver it to a dissatisfied customer.

In 1995, Capers Jones stated that the failure or cancellation rate of large software systems is over 20% Jones95. Of the 80% that are completed, approximately two thirds are late and experience cost overruns as much as 100%. Roughly two thirds are also plagued by low reliability and quality problems in the first year of development.

During the 1990s, forward-looking companies have turned away from documentation driven development and turned toward metric driven process improvement methodology. While some documentation is necessary, software measurement can replace documents that really do not help you visualize software.

In Chapter 1 we cite examples of successful software development projects from AT&T Bell Laboratories, AT&T Network Systems, Boeing, Bull, CSC, DEC, IBM Santa Teresa Laboratory, Microsoft, NASA, Raytheon, Toshiba, and University of Maryland.
Process improvement activities at these organizations have resulted in annual defect reduction of up to 32%, cost reduction of up to 8.4%, productivity gains of up to 32% and customer satisfaction gains of up to 7.8%.
All of these successful projects apply some form of software measurement as part of their process improvement activities. But many other major organizations still do not have process improvement activities and many of these have ended up with failed projects. Often the management of these software projects were not aware that there were any problems until it was too late to salvage the project or to prevent cost over runs, extended outages, or major delays in scheduled delivery. Management was not able to see that problems were developing.

In 1996, Norm Brown, Executive Director of the Department of Defense Software Acquisition Best Practices Initiative, stated, "In many cases, the true nature and pervasive extent of underlying project problems remains invisible to the project management until it is too late"Brown96. Metric based software project management helps you dynamically track software product development so you can apply management control early in a project cycle where adjustments can be effective.

This book has been written for practicing software developers, team leaders, project managers, top administrators and others interested in software measurement, project control, and process improvement. Also, it can serve as a reference text for advanced undergraduate or graduate classes in software engineering and as the main text for a course in software metrics, models, and process improvement.

The book is divided into three parts: Software Process Visualization, Models and Metrics, and Visualization Tool.
Part 1 on software process visualization contains five chapters.
Chapter 1 shows how software has become the high cost item for most computer projects. We explain how management has difficulty visualizing software during development and maintenance.

Examples are presented where industry has learned how to improve the software development process by applying software improvement methodology. Also, major failures are described where they were not able to visualize the software development process. Project technologies deployed in successful software projects and the Department of Defense PRINCIPAL BEST PRACTICES are introduced to show how they contain planning based on accurate measurements to improve project visibility.

Chapter 2 examines the criteria and standards for quality systems.

Chapter 3 introduces a set of project object classes for describing an arbitrary software development project. A view of the project world is introduced that uses objects, relationships, attributes, and properties. The objects are defined to be easily understandable by both managers and software developers. Once the object world is presented, the project personnel can use it to view the effect of changes in resources, schedules, or software product features from historical, status, compliance to plans, and prediction perspectives. We also introduce a dual project control/process improvement cycle where common visualization stages help you efficiently and objectively gather object attributes for analysis and prediction.

In Chapter 4, we partition the evolution of the software life cycle (SLC) into three time periods: Early SLC, Black Box SLC, and Process SLC. We start with simple SLCs for small projects and proceed to complex SLCs for large software products. For the Black Box SLC period, we examine waterfall, V, prototype, and incremental SLCs. For the Process SLC period we examine spiral and the natural milestone SLCs. We also describe the IEEE SLC Process Model, which contains development, management, and integral processes.

In Chapter 5 we describe the object classes that can be used to portray a software project. You can use the objects to see exactly what is happening in a software development project. All activities of SLC processes can be made visible to you. After we describe the object classes, we give a few examples of the object attributes. Detailed descriptions of all software project object classes, attributes, relationships, and properties are presented in Appendix A.

Part 2 on models and metrics, contains nine chapters. Chapter 6 describes size metrics, including volume, structure, rework, and reuse. Models that use amount of new code and recycled code to predict equivalent amount of new code are included.

Chapter 7 describes effort prediction models that range from models based on a single cost driver to complex composite models based on many cost drivers. Cost dominators are introduced which are project attributes that can have a 10:1 affect on project cost. These dominators often cause projects to fail.

Chapter 8 explains how you can estimate development time. Development schedule compression is examined from a team viewpoint and then from the overall project perspective.

Chapter 9 shows how to predict productivity based on effort and volume models. Eighteen cost drivers are examined to see their affect on productivity. Efficiency and speedup up prediction models are introduced to show how communications, work breakdown structure, and number of team members affect productivity.

In Chapter 10, we examine the many factors that contribute to overall software product quality. We then show that software reliability and usability are the main contributors to overall software product quality.

Chapter 11 presents a review of some well-known reliability models, both stochastic and nonstochastic (static) models, to pave the way for the future development and evaluation of highly reliable software and of systems involving software and hardware. Important issues like software life-cycle costs, software safety, and the necessity of documents are also addressed.

Chapter 12 explains how to validate that the correct features have been built into a software product and to verify that the design is properly implemented. In this chapter, we cover the software test process, test categorization, test management systems, test tools, defect management, and test process measurement.

Chapter 13, on usability, addresses the problem of user frustration because they have difficulty operating software. We first define usability, explain how to design for usability, usability models, and summarize recent studies in usability.

Chapter 14 shows how to test for usability. We explain the purpose of testing; we then describe usability test variables, scenarios, and procedures.

Part 3, on visualization tool, contains a single chapter describing the Project Attribute Monitoring and Predicting Associate (PAMPA) tool. The PAMPA tool is being developed to unobtrusively gather metrics from any software development project. The information is preserved as objects with attributes, relationships, and properties. The objects are recognizable and understandable by a typical software developer or manager. PAMPA analyzes the information and presents it in a variety of formats including management reports, 2-dimensional graphs, 3-dimensional graphs, radar graphs, histograms, and Pareto diagrams. Intelligent agents can be developed to monitor projects and alert management of anomalies. An initial version of the PAMPA tool is included on a compact disk (CD) in the cover of this book. With it, after you transfer your files to your workstation, you can gather project information from any software development project and then save it in an understandable object/attribute format.

You can then view the projects using an inexpensive workstation that runs Microsoft Windows 95 or NT and Office 97. Included is an Object Editor to view detailed attributes and relationships of project objects. Also, there is a Metric Plot Generator that allows you to view 2-dimensional graphs and bar charts, 3-dimensional charts, and radar charts. Sorting features are available that allows you to create a Pareto chart from bar charts. This simple version of PAMPA gathers information on a software product written in C or C++ program languages. Later versions will gather information for any arbitrary language that is developed in an arbitrary development environment. Also they will gather information from defect manage systems, feature tracking systems, planning systems, suppliers, and customers. A PAMPA Users Manual is included in Appendix B. The manual contains information for installing and configuring PAMPA, a PAMPA tutorial, an Object Editor reference section and a Metric Plot Generator section.

The authors gratefully acknowledge the contributions of numerous individuals whose assistance and support were invaluable to the development of this book and the software that is included with the book. Jamileth Holtfrerich made an indispensable contribution by coordinating tasks and organizing all draft versions of the book. Hewlett Packard (HP) provided financial support to the Software Process Improvement Laboratory at Texas A&M University where the PAMPA tool was developed. Art Lane, the HP representative to the Computer Science Department Development Council Committee, was a key individual in helping to establishing a working relationship between HP and the Software Process Improvement Laboratory at Texas A&M University.

We would like to thank the following managers at HP for their support: Von Hansen, Bob Deely, Don Wadley, Gary Johnston, Mark Brown, Ming-Zen Kuo, and Tommy Mouser.

Many students worked directly and indirectly on the many projects within the Software Process Improvement Laboratory. While we cannot recognize every student, we would like to thank the following for their contributions to the development of the PAMPA tool: David Aldridge, John Burton, Chris Chapman, Travis Chow, Clayton Daigle, Mark Fleming, Mario Garcia, Gunawan, Mark Hashimoto, Jason Jaynes, Doug Keegan, Gunadi Lauw, Jeremy Mayhew, Steve Mazzucco, Ryan Moran, Anh Nguyen, David Quick, Balaji Rathakrishnan, Saravjit Rihal, Michael Schmidlkofer, Jeffery Sharp, Linda Thai, Jason Thompson, Glen Weinecke, and Matthew Wilson.

