Measures Of Interobserver Agreement

Agreement among at least two evaluators is an issue of prime importance to statisticians, clinicians, epidemiologists, psychologists, and many other scientists. Measuring interobserver agreement is a method used to evaluate inconsistencies in findings from different evaluators who collect the same or similar information. Highlighting applications over theory, Measures of Interobserver Agreement provides a comprehensive survey of this method and includes standards and directions on how to run sound reliability and agreement studies in clinical settings and other types of investigations.

The author clearly explains how to reduce measurement error, presents numerous practical examples of the interobserver agreement approach, and emphasizes measures of agreement among raters for categorical assessments. The models and methods are considered in two different but closely related contexts: 1) assessing agreement among several raters when the response variable is continuous, and 2) assessing agreement when the investigators have decided in advance to use categorical scales to judge the subjects enrolled in the study. While the author thoroughly discusses the practical and theoretical issues of case 1, a major portion of this book is devoted to case 2. He explores issues such as two raters randomly judging a group of subjects, interrater bias and its connection to marginal homogeneity, and statistical issues in determining sample size.
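The central measure for case 2 is Cohen's kappa, the chance-corrected agreement index. The sketch below is a hypothetical illustration in Python (not the book's own SAS programs; the function name and data are invented for the example) of kappa for two raters classifying the same subjects:

```python
# Illustrative sketch (not code from the book): Cohen's kappa for two
# raters assigning the same n subjects to categories.
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters."""
    n = len(ratings_a)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement if the raters judged independently,
    # computed from each rater's marginal category proportions
    marg_a, marg_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum((marg_a[c] / n) * (marg_b[c] / n)
              for c in set(marg_a) | set(marg_b))
    return (p_o - p_e) / (1 - p_e)

# Two raters judging 10 subjects as positive (1) or negative (0):
# they agree on 8 of 10, while chance alone predicts 0.52 agreement
rater_1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
rater_2 = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohens_kappa(rater_1, rater_2), 3))  # → 0.583
```

A kappa of 1 indicates perfect agreement, while 0 indicates agreement no better than chance would produce.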

Statistical analyses of real and hypothetical datasets are presented to demonstrate the various applications of the models in repeatability and validation studies. To help with problem solving, the monograph includes SAS code, both within the book and on the CRC Web site. The author presents information with the right amount of mathematical detail, making this a cohesive book that reflects new research and the latest developments in the field.
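For the continuous case (case 1), the usual reliability index is the intraclass correlation. As a rough sketch, assuming each of n subjects is rated by the same k raters, the one-way ANOVA estimator can be written as follows (in Python rather than the book's SAS; the function name and data are invented for illustration):

```python
# Illustrative sketch (not code from the book): one-way ANOVA intraclass
# correlation, ICC = (MSB - MSW) / (MSB + (k - 1) * MSW), where MSB and
# MSW are the between- and within-subject mean squares.
def icc_oneway(scores):
    """scores: one list of k ratings per subject, all on a continuous scale."""
    n, k = len(scores), len(scores[0])
    grand_mean = sum(sum(row) for row in scores) / (n * k)
    subject_means = [sum(row) / k for row in scores]
    # Between-subject mean square
    msb = k * sum((m - grand_mean) ** 2 for m in subject_means) / (n - 1)
    # Within-subject mean square
    msw = sum((x - m) ** 2
              for row, m in zip(scores, subject_means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Three raters scoring five subjects on a continuous scale
data = [[9.1, 9.3, 9.0],
        [7.2, 7.0, 7.4],
        [8.0, 8.2, 8.1],
        [6.1, 6.3, 6.0],
        [9.5, 9.4, 9.6]]
print(icc_oneway(data))  # close to 1, since the raters nearly agree
```

Values near 1 indicate that almost all observed variation is between subjects rather than between raters, i.e., high reliability.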

Table of Contents

INTRODUCTION

RELIABILITY FOR CONTINUOUS SCALE MEASUREMENTS
  • Model for Reliability Studies
  • Inference Procedures on the Index of Reliability for Case (1)
  • Analysis of Method-Comparison Studies
  • Comparing Reliability Coefficients

MEASURES OF 2x2 ASSOCIATION AND AGREEMENT OF CROSS-CLASSIFIED DATA
  • Introduction
  • Indices of Adjusted Agreement
  • Cohen's Kappa: Chance-Corrected Measure of Agreement
  • Intraclass Kappa
  • The 2x2 Kappa in the Context of Association
  • Stratified Kappa
  • Conceptual Issues

COEFFICIENTS OF AGREEMENT FOR MULTIPLE RATERS AND MULTIPLE CATEGORIES
  • Introduction
  • Multiple Categories and Two Raters
  • Agreement for Multiple Raters and Dichotomous Classification
  • Multiple Raters and Multiple Categories
  • Testing the Homogeneity of Kappa Statistic from Independent Studies

ASSESSING AGREEMENT FROM DEPENDENT DATA
  • Introduction
  • Dependent Dichotomous Assessments
  • Adjusting for Covariates
  • Likelihood-Based Approach
  • Estimating Equations Approach
  • Loglinear and Association Models
  • Appendix I: Joint Probability Distribution of Repeated Dichotomous Assessments
  • Appendix II: Correlation between Estimated Kappas

SAMPLE SIZE REQUIREMENTS FOR THE DESIGN OF A RELIABILITY STUDY
  • Introduction
  • The Case of Continuous Measurements
  • The Non-Normal Case
  • Cost Implications
  • The Case of Dichotomous Assessments

BIBLIOGRAPHY
