Global competition, the time sensitivity of the new Internet economy, and increasing customer demand for better software quality are pushing companies to undertake software process improvement (SPI) initiatives. Numerous software organizations worldwide have implemented these initiatives with varying degrees of success. Many have adhered to standard SPI practice, only to experience less-than-satisfactory results when the execution proves more difficult than expected and enthusiasm and resources wane.
Improving Software Organizations offers a modern perspective on SPI, outlining and discussing what it takes to move from SPI theory to successful SPI initiatives. Based on the results of the three-year National Danish SPI Initiative, this book draws on the experiences of four world-class companies—Danske Data, Brüel & Kjær, Ericsson Denmark, and Systematic Software Engineering—and presents a proven roadmap for successful SPI. It distills in-depth studies of these organizations—the strategies, approaches, and specific techniques that yielded tangible results—into a comprehensive framework for planning and executing SPI projects throughout the project lifecycle.
Improving Software Organizations presents the major lessons learned in the four companies. It provides an overview of the theories and models that formed the basis of the SPI initiatives. It also provides an in-depth examination of the four companies' development organizations, how each began the SPI initiative, what mistakes were made, and how they ultimately succeeded.
For each of the five SPI principles, the book offers examples from practice that demonstrate how successful organizations approached the issue. From these examples and the more detailed case studies, you will gain an understanding of how to design, implement, and execute an SPI initiative that is right for your organization.
I. LEARNING TO IMPROVE.
1. Learning SPI in Practice.
2. Mapping SPI Ideas and Practices.
II. LEARNING FROM EXPERIENCE.
3. The Correct Effort.
4. The Ambitious Effort.
5. The Grassroots Effort.
6. The Adolescent Effort.
III. INITIATING LEARNING.
7. Learning from Assessments.
8. From Problem Reports to Better Products.
9. Problem Diagnosis in SPI.
10. Project Assessments.
11. A Framework for Selecting an Assessment Strategy.
IV. ORGANIZING FOR LEARNING.
12. Knowing and Implementing SPI.
13. Improving Customer Relations.
14. Strategies for Organizational Learning in SPI.
V. TECHNIQUES FOR LEARNING TO IMPROVE.
15. Implementing SPI: An Organizational Approach.
16. Risk Management in Process Action Teams.
17. Principles of Metrics Implementation.
18. Better Requirements.
Appendix A: Risk and Action Tables.
In this book, Improving Software Organizations, we discuss ways to understand and develop the core competencies required to succeed with SPI. Our approach is pragmatic and action-oriented. We examine SPI experiences from real-world situations and distill from them essential lessons for planning, implementing, and managing SPI initiatives to successful completion.
Our book is the result of a collaboration between four Danish companies—Danske Data, Brüel & Kjær, Ericsson Denmark, and Systematic Software Engineering—three universities—Aalborg University, Copenhagen Business School, and the Technical University of Denmark—and an R&D organization, Delta. The project was part of the Danish National SPI Initiative and lasted from January 1997 to December 1999. It was funded in part by the government of Denmark through the Danish National Center for IT Research. During the three-year project, scientists and engineers from the companies and universities worked together on SPI projects within the companies. A primary objective of our collaboration was not only to successfully implement SPI in the companies but also to develop principles and strategies for effectively executing SPI initiatives. From the beginning, we set out to examine and develop solutions for difficult practical problems reported by other SPI experts. In these pages, we present our findings and reflections based on our experiences practicing SPI. We hope that you find our book informative and that the information in it supports your own efforts to solve the practical problems involved in planning and implementing your own SPI programs.
Brüel & Kjær’s main office is in Nærum (just north of Copenhagen), and the company operates more than 50 sales offices and agencies worldwide. In 1998, Brüel & Kjær was divided into two separate companies:
Sound and Vibration is the larger of the two companies, with 550 employees. Approximately 80 of these employees are development engineers, of whom 40 are software developers. Annually, 10 to 15 development projects are carried out, with 4 to 8 people in each project group. Condition Monitoring Systems has some 50 employees, of whom 10 are software developers. Over the past 10 to 15 years, Brüel & Kjær has been transformed from a company focused on hardware, mechanics, and electronics to a company focused on software. Today, two out of three engineers at Brüel & Kjær are software engineers. Most Brüel & Kjær employees have an engineering education; a few have backgrounds in business or computer science.
In the mid-1990s, Brüel & Kjær transformed itself from a departmental organization to a project-oriented organization. As part of this process, the entire middle management layer was replaced. Several other employees were trained in project management and given responsibility for managing development projects in the new organization. During the 1990s, Brüel & Kjær carried out several other organizational change initiatives. In 1994, the company successfully completed ISO 9000 certification.
When assessed in October 1996, Brüel & Kjær measured at level 2.25 on the Bootstrap scale. It was the only one of the four collaborating companies that started the SPI project at maturity level 2. In the fall of 1999, Brüel & Kjær was again assessed using the Bootstrap model, and the result showed an increase in maturity to 2.5.
Software development projects at Danske Data vary widely in size; most are small and short-term, but there are also some major projects that have strategic implications for the entire corporation. Project teams of 3 to 5 people typically handle the smaller projects, which usually take 6 to 12 months. Large projects, such as the Year 2000 compliance project, typically involve as many as 150 people and last 6 months to 3 years. Danske Data has four development divisions, each headed by a senior vice president. Each individual division is led by a vice president and organized into departments, typically with 20 to 50 people divided among five or so projects. Project managers oversee regular projects, and the vice president manages high-profile projects. Software developers at Danske Data typically have a bachelor’s degree in either an IT-related field or banking.
Danske Data develops software mainly for mainframe computers but also develops some applications for client/server environments, such as Internet banking. Danske Data mainframe applications run 24 hours a day and process a daily average of nine million transactions from about 11,000 workstations. The company's mainframe installation is the largest in Northern Europe and is divided between two operation centers. Systems developed for this platform are based on an advanced event-oriented database principle, which increases data processing flexibility. Security and reliability are the two main system requirements because data are mirrored in real time between the two operation centers in Århus and Copenhagen. Modern methods for modeling data, functions, and workflow are used along with the all-important business model—the information framework—which is crucial to getting stakeholders from the user organization involved in the development process.
In May 1997, Danske Data conducted its first assessment of software process maturity. It used both the Capability Maturity Model (CMM) and Bootstrap assessment approaches, which showed the company to be between levels 1 and 2 (1.5 on the Bootstrap scale). Danske Data was assessed again in October 1999 and had by then reached level 2.0.
In early 1996, Ericsson Corporation changed its organizational structure from a line to a matrix organization. In the period following—from 1996 to 1998—Ericsson Denmark’s staff increased from 250 to 400, and each of its product groups reported to corresponding business units located in other countries. Both the Ericsson Corporation and Ericsson Denmark have a long history of improving software development. In 1992, the company took the first steps to set up a corporatewide SPI program, the Ericsson System Software Initiative (ESSI). From the beginning, ESSI was a strategic effort that ensured alignment, deployment, and follow-up on corporate SPI goals. ESSI’s first intervention was in Ericsson’s largest and most complex software development area, the telephone exchange software group. An aggressive goal was defined to reduce fault density in telephone exchange software products by 50% annually.
Another important ESSI initiative focused on CMM as a long-term strategy for improving software development performance. The initiative was supported by the creation of an international corps of trained CMM assessors tasked with determining the level of software process maturity throughout the company. At the end of 1996, the ESSI program had been operational worldwide for a couple of years, and most of the company’s international software development sites had shown good progress toward reaching the corporate fault density goals.
Ericsson Denmark was assessed at level 1 in 1995 and at level 2 in June 1998. In between the two assessments, the division underwent both Light Assessments and UltraLight Assessments.
In 1996, Systematic employed 137 people. Of these employees, 105 were software engineers and 32 worked in finance, administration, internal IT, quality assurance, canteen, and cleaning. By 1999, the number of employees had grown to 155. At Systematic, all software development takes place in project teams, led by a project manager. Most managers started with the company as software engineers and were later trained internally for management responsibilities. In 1998-99, project teams ranged in size from 2 to 18 members and projects lasted from two months to three years. Typically, project members were not rotated out; they stayed with the project from the analysis phase through requirements specification, design, programming, test, documentation, installation, and user training. This practice reflects the company’s belief that such consistency ensures maximum commitment and development of staff competence.
Despite the small number of graduates in computer science and systems engineering in Denmark, two-thirds of Systematic’s employees hold master’s or doctoral degrees. To facilitate high flexibility and preparedness for change, the company recruits highly educated people with knowledge of state-of-the-art technologies. One of the main reasons Systematic undertook SPI was to help meet its goal of becoming an internationally recognized software supplier and systems integrator in communications and interoperability between defense units, and in electronic commerce and data interchange between enterprises. In 1992, Systematic’s quality assurance system was certified in accordance with ISO 9001 and the military standards AQAP 110 and 150. The ISO 9001 certified quality management system is the basis of numerous elements in Systematic’s quality assurance procedures.
In 1997, Systematic conducted its first software process maturity assessment using both the CMM and Bootstrap approaches and was rated to be just under Bootstrap 2. In 1998 and 1999, the company conducted additional Bootstrap assessments, and in 1999 the company was assessed to be at level 2.5 (using the Bootstrap maturity scale).
Part III, Initiating Learning, focuses on how to structure learning conditions and initiate learning in SPI initiatives. We discuss maturity level assessments as an important mechanism for learning. We have used a broad range of assessment methods. Some were inspired by formalized approaches, such as CMM or Bootstrap (discussed in Chapters 7 and 10), whereas others were invented in project groups (Chapters 8 and 9). Finally, Chapter 11 discusses how to select an appropriate assessment strategy. Part IV, Organizing for Learning, goes beyond assessments and takes a more reflective look at SPI: In Chapter 12, we reflect on knowledge transfer; in Chapter 13, we discuss customer maturity; and in Chapter 14 we focus on organizational learning in the SPI context.
Part V examines interesting details in different techniques for SPI. Chapter 15 presents a framework for implementing SPI programs, and the remaining chapters offer detailed discussions of how to carry out risk assessments (Chapter 16), how to implement a metrics program (Chapter 17), and how to improve requirements specification (Chapter 18).
This book is the product of a truly collaborative effort. The team of engineers and scientists who authored the chapters is listed at the very end of this book. Three of the authors—Lars Mathiassen, Jan Pries-Heje, and Ojelanki Ngwenyama—edited this book, assisted by Keri Schreiner, who worked closely with the authors to help them write for practitioners. Finally, the staff at Addison-Wesley provided valuable support in designing and producing the book.
In those early days, most of the advice we offered was based on the technology transfer literature. However, there were two problems with this advice. First, improving an organization's processes was not always like deploying a new technology. In fact, the technology transfer literature was rife with examples of transfer failures caused when technologies required major changes to an organization's processes. Second, as we produced the CMM, we realized that introducing new technologies was a risky undertaking until an organization had reached at least level 3. In essence, the models that worked for technology transfer were primarily designed for organizations with defined processes. These were not the organizations calling us. What advice should we offer to organizations in the throes of level 1 adhocracy?
Fortunately, companies began reporting their software process improvement experiences over the next several years in journals and at conferences. Some people who had been in improvement groups began publishing books on their experiences and lessons learned. However, there was no source that integrated and compared software process improvement experiences from multiple companies. This book fills that void.
Improving Software Organizations: From Principles to Practice is one of the best books ever written on software process improvement. It describes real industrial experiences and admits to the problems that were experienced in implementing software process improvement and how they were addressed. Perhaps the greatest lesson in this book is that none of the reigning models of how to conduct improvement programs is sufficient to guarantee success. While most of the models seem academically proper, the action research reported here uncovers the very real limitations in their effectiveness. True to the tenets of good action research, the final chapters induce the lessons learned across this broad research program. Confidence in the generalizability of these lessons across companies is difficult to establish without the broad cooperation and support that was achieved in Denmark during the late 1990s.
With so many companies in so many nations spending so much money on software process improvement, why did an industry-leading book emerge from a country with a comparatively small population? First, because the country cared. The Danish government invested in learning how to increase the capability of its companies to compete in software development. It recognized that organized research and learning would serve the country's industry better than isolated reports delivered at foreign conferences.
Second, because management cared. The four companies that participated in producing the lessons presented in this book realized the critical role that software played in their businesses and that software process improvement was critical to their competitiveness. They believed that the rate of learning from comparing mutual experience exceeded the learning to be gleaned from their individual experiences. Management believed that the benefits gained from participating in pre-competitive research far exceeded the risks of sharing internal experiences, not all of which were positive.
Third, because academia cared. The action research tradition pioneered in Scandinavia is displayed at its most beneficial in this book. Danish researchers ventured beyond their laboratories and campuses to take the risk of applying their ideas in actual practice. This book stands as a testament to the national benefits that can accrue from energetic collaborations between government, industry, and academia. Does publishing these lessons reveal national secrets to countries that didn't pay for the research? Of course it does. But little competitive advantage will be lost, since these companies will be implementing a whole new round of improvements by the time you read this book.
This book is a critical resource for software organizations across the globe. It is important not only because of its valuable lessons, but also because it demonstrates the power of pre-competitive collaborative research. Research I wish I had had access to when I was responding to those phone calls in 1991.
Dr. Bill Curtis
Ft. Worth, Texas