Calibration and Validation of the SAGE Software Cost/Schedule Estimating System to United States Air Force Databases
This research entailed calibration and validation of the SAGE Software Cost/Schedule Estimating System, Version 1.7, as a means to improve estimating accuracy for DoD software-intensive systems and thereby introduce stability into software system development. SAGE calibration used historical data from completed projects at the Space and Missile Systems Center (SMC) and the Electronic Systems Center (ESC) to derive average performance factors (i.e., calibration factors) for pre-defined categories of projects. A project was categorized for calibration either by its primary application or by the contractor that developed it; the intent was to determine which categorization is more appropriate for calibration. SAGE validation used the derived calibration factors to predict completed efforts that were not used in deriving the factors. Statistical resampling employing Monte Carlo simulation was used to calibrate and validate the model on each possible combination of a category's projects. Three statistical measures assessed model performance in the default and calibrated estimating modes. SAGE generally did not meet pre-established criteria for estimating accuracy, although the model demonstrated some improvement with calibration. Calibrating projects categorized by contractor yielded better calibrated model performance than calibrating projects categorized by application; this categorization is suggested for future consideration.
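The resampling scheme the abstract describes, calibrating on every possible combination of a category's projects and validating on the projects held out, can be sketched in a few lines. The abstract does not identify the three statistical measures or SAGE's internals, so the sketch below uses hypothetical project data and two common accuracy measures from the cost-estimation literature, MMRE and PRED(25), purely as stand-ins:

```python
import itertools
import statistics

# Hypothetical records of (actual effort, model's default estimate),
# standing in for one category's SMC/ESC projects. Values are illustrative.
projects = [
    (120.0, 150.0), (95.0, 80.0), (200.0, 260.0),
    (60.0, 55.0), (140.0, 175.0), (88.0, 110.0),
]

def calibration_factor(subset):
    """Average actual-to-estimated ratio over the calibration subset."""
    return statistics.mean(actual / est for actual, est in subset)

def mmre(errors):
    """Mean magnitude of relative error."""
    return statistics.mean(errors)

def pred(errors, level=0.25):
    """Fraction of predictions within `level` of the actual value."""
    return sum(e <= level for e in errors) / len(errors)

# Calibrate on every possible subset of k projects; validate on the rest.
k = 4
holdout_errors = []
for cal_set in itertools.combinations(projects, k):
    factor = calibration_factor(cal_set)
    for actual, est in projects:
        if (actual, est) in cal_set:
            continue  # skip projects used to derive the factor
        calibrated = est * factor
        holdout_errors.append(abs(actual - calibrated) / actual)

print(f"MMRE:     {mmre(holdout_errors):.3f}")
print(f"PRED(25): {pred(holdout_errors):.3f}")
```

Exhausting every combination, rather than drawing random splits, gives the same distribution of holdout errors that a Monte Carlo simulation would converge to, which is practical here because the per-category project counts are small.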
by David B. Marzo
Product Details

ISBN-13: 9781288285853
Publisher: Biblioscholar
Publication date: 11/12/2012
Pages: 130
Product dimensions: 7.44(w) x 9.69(h) x 0.28(d)