Model Selection and Inference: A Practical Information-Theoretic Approach (1st Edition)

This book is unique in that it covers both the philosophy of model-based data analysis and a strategy for the analysis of empirical data. Under the information-theoretic approach presented here, model selection attempts to identify the (likely) best model, ranks the candidate models from best to worst, and calibrates the plausibility that each model is really the best as a basis for inference. Model selection methods are extended to allow inference from more than a single "best" model, and several methods are given for incorporating the uncertainty about which model is "best" into estimates of precision. An array of examples illustrates the various technical issues. This is an applied book, written primarily for biologists and statisticians who use models to draw inferences from empirical data. Research biologists working in the field or in the laboratory will find simple methods that are likely to be useful in their investigations, and applied statisticians will find the information-theoretic methods a superior alternative, especially for observational studies. Readers interested in the empirical sciences more broadly will find this material useful, as it offers an alternative to hypothesis testing and Bayesian approaches.
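The core quantities of the approach described above can be sketched in a few lines. The formulas (AIC = -2 ln L + 2K, the small-sample AICc correction, the differences Δ_i, and the Akaike weights w_i) are standard; the candidate-model names, log-likelihood values, and sample size below are made up purely for illustration.

```python
import math

def aicc(log_lik, k, n):
    """Second-order (small-sample) Akaike information criterion:
    AICc = -2 ln(L) + 2K + 2K(K+1)/(n - K - 1)."""
    aic = -2.0 * log_lik + 2.0 * k
    return aic + (2.0 * k * (k + 1)) / (n - k - 1)

def akaike_weights(scores):
    """Compute the differences Delta_i = AIC_i - min_j AIC_j and the
    Akaike weights w_i = exp(-Delta_i/2) / sum_j exp(-Delta_j/2)."""
    best = min(scores)
    deltas = [s - best for s in scores]
    raw = [math.exp(-d / 2.0) for d in deltas]
    total = sum(raw)
    return deltas, [r / total for r in raw]

# Hypothetical candidate set: (name, maximized log-likelihood, K parameters)
n = 60  # sample size (illustrative)
candidates = [("g1", -112.4, 3), ("g2", -110.9, 5), ("g3", -110.5, 8)]

scores = [aicc(ll, k, n) for _, ll, k in candidates]
deltas, weights = akaike_weights(scores)
for (name, _, _), d, w in zip(candidates, deltas, weights):
    print(f"{name}: delta = {d:.2f}, weight = {w:.3f}")
```

The weights sum to one and can be read as the relative plausibility that each candidate is the Kullback-Leibler best model of the set, which is how they feed into the "ranking and calibrating" and model-averaging ideas outlined in the table of contents.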

Product Details

  • ISBN-13: 9780387985046
  • Publisher: Springer-Verlag New York, LLC
  • Publication date: 1/28/1998
  • Edition description: Older Edition
  • Edition number: 1
  • Pages: 372
  • Product dimensions: 6.30 (w) x 9.46 (h) x 0.88 (d)

Table of Contents

1. Introduction
Objectives of Book
Background Material
Inference, Given a Model
The Critical Issue: "What Model To Use?"
Science Inputs - Formulation of Set of Candidate Models
Models Versus Full Reality
A "Best Approximating Model"
Overview of Models in the Biological Sciences
Types of Models Commonly Used in the Biological Sciences
Likelihood and Least Squares Theory
Data Dredging
Some Trends
Inference and the Principle of Parsimony
Over-fitting to Achieve a Good Model Fit
The Principle of Parsimony
Model Selection Methods
Model Selection Uncertainty
2. Information Theory and Log-Likelihood Models - A Basis for Model Selection and Inference
The Distance or Discrepancy Between Two Models
f, Truth, Full Reality and "True Models"
g, Approximating Models
Kullback-Leibler Distance (or Information)
Truth, f, Drops Out as a Constant
Akaike's Information Criterion
Akaike's Predictive Expected Log-Likelihood
Important Refinements to AIC
A Second Order AIC
Modification to AIC for Overdispersed Count Data
A Useful Analogy
Some History
The G-statistic and K-L Information
Further Insights
A Summary
Further Comments
A Heuristic Interpretation
Interpreting Differences Among AIC Values
Non-nested Models
Model Selection Uncertainty
AIC When Different Data Sets are to be Compared
Order Not Important in Computing AIC Values
Hypothesis Testing Is Still Important
Comparisons with Other Criteria
Information Criteria That Are Estimates of K-L Information
Criteria That Are Consistent for K
Return to Flather's Models
3. Practical Use of the Information Theoretic Approach
Computation and Interpretation of AIC Values
Example 1 - Cement Hardening Data
Set of Candidate Models
Some Results and Comparisons
A Summary
Example 2 - Time Distribution of an Insecticide Added to a Simulated Ecosystem
Set of Candidate Models
Some Results
Example 3 - Nestling Starlings
Experimental Scenario
Monte Carlo Data
Set of Candidate Models
Data Analysis Results
Further Insights Into the First 14 Nested Models
Hypothesis Testing and Information Theoretic Approaches Have Different Selection Frequencies
Further Insights Following Final Model Selection
Why Not Always Use the Global Model for Inference?
Example 4 - Sage Grouse Survival
Set of Candidate Models
Model Selection
Hypothesis Tests for Year-Dependent Survival Probabilities
Hypothesis Testing Versus AIC in Model Selection
A Class of Intermediate Models
Example 5 - Resource Utilization of Anolis Lizards
Set of Candidate Models
Comments on Analysis Method
Some Tentative Results
4. Model Selection Uncertainty with Examples
Methods for Assessing Model Selection Uncertainty
AIC Differences and a Confidence Set on the K-L Best Model
Likelihood for a Model and Akaike Weights
More Options for a Confidence Set on the K-L Best Model
Δ_i, Model Selection Probabilities, and the Bootstrap
Concepts of Parameter Estimation and Model Selection Uncertainty
Including Model Selection Uncertainty in Estimator Sampling Variance
Unconditional Confidence Intervals
Uncertainty of Variable Selection
Model Redundancy
Cement Data
Anolis Lizards in Jamaica
Simulated Starling Experiment
Sage Grouse in North Park
Sakamoto et al.'s (1986) Simulated Data
Pine Wood Data
5. Monte Carlo Insights and Extended Examples
Survival Models
A Chain Binomial Survival Model
An Extended Survival Model
Further Survival Models
General Linear Model
Estimation of Density from Line Transect Sampling
Density Estimation Background
Line Transect Sampling of Kangaroos at Wallaby Creek
Analysis of Wallaby Creek Data
Bootstrap Analysis
6. Statistical Theory
Useful Preliminaries
A General Derivation of AIC
General K-L Based Model Selection: TIC
Analytical Computations of TIC
Bootstrap Estimation of TIC
AICc, a Second Order Improvement
Derivation of AICc
Lack of Uniqueness of AICc
Derivation of AIC for the Exponential Family of Distributions
Evaluation of tr(J(θ_o)[I(θ_o)]^-1) and Its Estimator
Comparison of AIC vs. TIC in a Very Simple Setting
Evaluation Under Logistic Regression
Evaluation Under Multinomially Distributed Count Data
Evaluation Under Poisson Distributed Data
Evaluation for Fixed-effects Normality-Based Linear Models
Additional Results and Considerations
Selection Simulation for Nested Models
Simulation of the Distribution of Δ_p
Does AIC Over-fit?
Can Selection Be Improved Based on All the Δ_i?
Linear Regression, AIC and Mean Square Error
AIC and Random Coefficient Models
AICc and Models for Multivariate Data
There is No True TICc
Kullback-Leibler Information Relationship to the Fisher Information Matrix
Entropy and Jaynes MaxEnt Principle
Akaike Weights, w_i, versus Selection Probabilities, π_i
Model Goodness of Fit After Selection
Kullback-Leibler Information is Always >= 0
7. Summary
The Scientific Question and the Collection of Data
Actual Thinking and A Priori Modeling
The Basis of Objective Model Selection
The Principle of Parsimony
Information Criteria as Estimates of Relative Kullback-Leibler Information
Ranking and Calibrating Alternative Models
Model Selection Uncertainty
More on Inferences
Final Thoughts
8. References
9. Index