Data Collection: Planning for and Collecting All Types of Data / Edition 1

Paperback (Print)
Used and New from Other Sellers
from $27.91
Usually ships in 1-2 business days
(Save 35%)
Other sellers (Paperback)
  • All (7) from $27.91   
  • New (4) from $27.91   
  • Used (3) from $31.71   

Overview

Data Collection

Data Collection is the second of six books in the Measurement and Evaluation Series from Pfeiffer. The proven ROI Methodology—developed by the ROI Institute—provides a practical system for evaluation planning, data collection, data analysis, and reporting. All six books in the series offer the latest tools, most current research, and practical advice for measuring ROI in a variety of settings.

Data Collection offers an effective process for collecting data that is essential to the implementation of the ROI Methodology. The authors outline the techniques, processes, and critical issues involved in successful data collection. The book examines the various methods of data collection, including questionnaires, interviews, focus groups, observation, action plans, performance contracts, and monitoring records. Written for evaluators, facilitators, analysts, designers, coordinators, and managers, Data Collection is a valuable guide for collecting data that are adequate in quantity and quality to produce a complete and credible analysis.


Product Details

  • ISBN-13: 9780787987183
  • Publisher: Wiley
  • Publication date: 1/2/2008
  • Series: Measurement and Evaluation Series, #1
  • Edition description: New Edition
  • Edition number: 1
  • Pages: 192
  • Sales rank: 984,083
  • Product dimensions: 6.04 (w) x 9.07 (h) x 0.48 (d)

Meet the Authors

Patricia Pulliam Phillips is an internationally recognized author, consultant, and president and CEO of the ROI Institute, Inc. Phillips provides consulting services to organizations worldwide. She helps organizations build capacity in the ROI Methodology by facilitating the ROI certification process and teaching the ROI Methodology through workshops and graduate-level courses.

Cathy A. Stawarski is program manager of the Strategic Performance Improvement and Evaluation program at the Human Resources Research Organization (HumRRO) in Alexandria, Virginia. She has more than twenty-five years of experience in research, training and development, and program evaluation. Throughout her nearly twenty years at HumRRO, she has worked primarily with clients in the federal sector. Her work includes leading and conducting the evaluation of leadership and human capital initiatives as well as assisting organizations in developing comprehensive evaluation strategies.

The ROI Institute, Inc., is a benchmarking, research, and information sharing organization that provides consulting services, workshops, and certification in the ROI Methodology. Widely considered the leading authority on evaluation and measurement of learning and development in organizations, the ROI Institute conducts workshops and offers certification for thousands of practitioners through a variety of strategic partners.


Table of Contents

Principles of the ROI Methodology.

1. Using Questionnaires and Surveys.
    Types of Questions.
    Questionnaire Design Steps.
        Determine the Specific Information Needed.
        Involve Stakeholders in the Process.
        Select the Types of Questions.
        Develop the Questions.
        Check the Reading Level.
        Test the Questions.
        Address the Anonymity Issue.
        Design for Ease of Tabulation and Analysis.
        Develop the Completed Questionnaire and Prepare a Data Summary.
    Improving the Response Rate for Questionnaires and Surveys.
        Provide Advance Communication.
        Communicate the Purpose.
        Describe the Data Integration Process.
        Keep the Questionnaire as Simple as Possible.
        Simplify the Response Process.
        Use Local Manager Support.
        Let the Participants Know That They Are Part of a Sample.
        Consider Incentives.
        Have an Executive Sign the Introductory Letter.
        Use Follow-Up Reminders.
        Send a Copy of the Results to the Participants.
        Review the Questionnaire with Participants.
        Consider a Captive Audience.
        Communicate the Timing of Data Flow.
        Select the Appropriate Media.
        Consider Anonymous or Confidential Input.
        Pilot Test the Questionnaire.
        Explain How Long Completing the Questionnaire Will Take.
        Personalize the Process.
        Provide an Update.
    Final Thoughts.

2. Using Tests.
    Types of Tests.
        Norm-Referenced Tests.
        Criterion-Referenced Tests.
        Performance Tests.
    Simulations.
        Electromechanical Simulation.
        Task Simulation.
        Business Games.
        In-Basket Simulation.
        Case Study.
        Role-Playing.
    Informal Tests.
        Exercises, Problems, or Activities.
        Self-Assessment.
        Facilitator Assessment.
    Final Thoughts.

3. Using Interviews, Focus Groups, and Observation.
    Interviews.
        Types of Interviews.
        Interview Guidelines.
            Develop the Questions to Be Asked.
            Test the Interview.
            Prepare the Interviewers.
            Provide Clear Instructions to the Participants.
            Schedule the Interviews.
    Focus Groups.
        Applications of Focus Groups.
        Guidelines.
            Plan Topics, Questions, and Strategy Carefully.
            Keep the Group Size Small.
            Use a Representative Sample.
            Use Experienced Facilitators.
    Observations.
        Guidelines for Effective Observation.
            Observations Should Be Systematic.
            Observers Should Be Knowledgeable.
            The Observer’s Influence Should Be Minimized.
            Observers Should Be Selected Carefully.
            Observers Must Be Fully Prepared.
        Observation Methods.
            Behavior Checklist.
            Delayed Report.
            Video Recording.
            Audio Monitoring.
            Computer Monitoring.
    Final Thoughts.

4. Using Other Data Collection Methods.
    Business Performance Monitoring.
        Using Current Measures.
            Identify Appropriate Measures.
            Convert Current Measures to Usable Ones.
        Developing New Measures.
    Action Planning.
        Developing an Action Plan.
        Using Action Plans Successfully.
            Communicate the Action Plan Requirement Early.
            Describe the Action Planning Process at the Beginning of the Program.
            Teach the Action Planning Process.
            Allow Time to Develop the Plan.
            Have the Facilitator Approve Action Plans.
            Require Participants to Assign a Monetary Value to Each Improvement.
            Ask Participants to Isolate the Effects of the Program.
            Ask Participants to Provide a Confidence Level for Estimates.
            Require That Action Plans Be Presented to the Group.
            Explain the Follow-Up Process.
            Collect Action Plans at the Stated Follow-Up Time.
            Summarize the Data and Calculate the ROI.
        Applying Action Plans.
        Identifying Advantages and Disadvantages of Action Plans.
    Performance Contracts.
    Final Thoughts.

5. Measuring Reaction and Planned Action.
    Why Measure Reaction and Planned Action?
        Customer Satisfaction.
        Immediate Adjustments.
        Team Evaluation.
        Predictive Capability.
        Importance of Other Levels of Evaluation.
    Areas of Feedback.
    Data Collection Issues.
        Timing.
        Methods.
        Administrative Guidelines.
    Uses of Reaction Data.
    Final Thoughts.

6. Measuring Learning and Confidence.
    Why Measure Learning and Confidence?
        The Learning Organization.
        Compliance Issues.
        Development of Competencies.
        Certification.
        Consequences of an Unprepared Workforce.
        The Role of Learning in Programs.
    Measurement Issues.
        Challenges.
        Program Objectives.
        Typical Measures.
        Timing.
        Data Collection Methods.
    Administrative Issues.
        Validity and Reliability.
        Consistency.
        Pilot Testing.
        Scoring and Reporting.
        Confronting Failure.
    Uses of Learning Data.
    Final Thoughts.

7. Measuring Application and Implementation.
    Why Measure Application and Implementation?
        Obtain Essential Information.
        Track Program Focus.
        Discover Problems and Opportunities.
        Reward Effectiveness.
    Challenges.
        Linking Application with Learning.
        Building Data Collection into the Program.
        Ensuring a Sufficient Amount of Data.
        Addressing Application Needs at the Outset.
    Measurement Issues.
        Methods.
        Objectives.
        Areas of Coverage.
        Data Sources.
        Timing.
        Responsibilities.
    Data Collection Methods.
        Questionnaires.
            Progress with Objectives.
            Use of Program Materials and Handouts.
            Application of Knowledge and Skills.
            Changes in Work Activities.
            Improvements or Accomplishments.
            Definition of the Measure.
            Amount of Change.
            Unit Value.
            Basis for Value.
            Total Annual Impact.
            Other Factors.
            Improvements Linked with the Program.
            Confidence Level.
            Perception of Investment in the Program.
            Link with Output Measures.
            Other Benefits.
            Barriers.
            Enablers.
            Management Support.
            Other Solutions.
            Target Audience Recommendations.
            Suggestions for Improvement.
        Interviews, Focus Groups, and Observation.
        Action Plans.
    Barriers to Application.
    Uses of Application Data.
    Final Thoughts.

8. Measuring Impact and Consequences.
    Why Measure Business Impact?
        Impact Data Provide Higher-Level Information on Performance.
        Impact Data Represent the Business Driver of a Program.
        Impact Data Provide Value for Sponsors.
        Impact Data Are Easy to Measure.
    Effective Impact Measures.
        Hard Data Measures.
        Soft Data Measures.
        Tangible Versus Intangible Measures.
    Impact Objectives.
    Linking Specific Measures to Programs.
    Sources of Impact Data.
    Data Collection Methods.
        Monitoring Business Performance Data.
            Identify Appropriate Measures.
            Convert Current Measures to Usable Ones.
            Develop New Measures.
        Action Plans.
            Set Goals and Targets.
            Define the Unit of Measure.
            Place a Monetary Value on Each Improvement.
            Implement the Action Plan.
            Document Specific Improvements.
            Isolate the Effects of the Program.
            Provide a Confidence Level for Estimates.
            Collect Action Plans at Specified Time Intervals.
            Summarize the Data and Calculate the ROI.
        Performance Contracts.
        Questionnaires.
    Final Thoughts.

9. Selecting the Proper Data Collection Method.
    Matching Exercise.
    Selecting the Appropriate Method for Each Level.
        Type of Data.
        Investment of Participants’ Time.
        Investment of Managers’ Time.
        Cost.
        Disruption of Normal Work Activities.
        Accuracy.
        Built-In Design Possibility.
        Utility of an Additional Method.
        Cultural Bias of Data Collection Method.
    Final Thoughts.

Index.

About the Authors.

