Subsampling

Overview

Since Efron's profound paper on the bootstrap, an enormous amount of effort has been spent on the development of bootstrap, jackknife, and other resampling methods. The primary goal of these computer-intensive methods has been to provide statistical tools that work in complex situations without imposing unrealistic or unverifiable assumptions about the data-generating mechanism. This book sets out to lay some of the foundations for subsampling methodology and related methods.
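
To make the basic idea concrete, here is a minimal sketch, not taken from the book, of a subsampling confidence interval for a sample mean in the i.i.d. setting: the statistic is recomputed on subsets of size b drawn without replacement, and the empirical distribution of the rescaled deviations from the full-sample estimate stands in for the unknown sampling distribution. The function name, the block size b, and the use of a random selection of subsets (rather than all n-choose-b of them) are illustrative choices, not prescriptions from the text.

    import numpy as np

    def subsample_ci_mean(x, b, alpha=0.05, n_subsets=2000, rng=None):
        """Approximate (1 - alpha) confidence interval for the population mean
        via subsampling in the i.i.d. case.  A random selection of size-b
        subsets stands in for all n-choose-b subsets; the root is
        sqrt(b) * (subsample mean - full-sample mean)."""
        rng = np.random.default_rng(rng)
        x = np.asarray(x, dtype=float)
        n = len(x)
        theta_hat = x.mean()                               # full-sample estimate
        roots = np.empty(n_subsets)
        for i in range(n_subsets):
            idx = rng.choice(n, size=b, replace=False)     # subsample WITHOUT replacement
            roots[i] = np.sqrt(b) * (x[idx].mean() - theta_hat)
        lo_q, hi_q = np.quantile(roots, [alpha / 2, 1 - alpha / 2])
        # Invert the approximated root distribution at the full-sample rate sqrt(n).
        return theta_hat - hi_q / np.sqrt(n), theta_hat - lo_q / np.sqrt(n)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        data = rng.exponential(scale=2.0, size=500)        # true mean = 2.0
        print(subsample_ci_mean(data, b=50, rng=1))

The intuition behind the method's wide applicability is that each subsample of size b is itself a genuine sample from the true underlying distribution, so no resampling model needs to be imposed.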

Editorial Reviews

From the Publisher
"The book is one of the most comprehensive texts in the subsampling realm and provides a solid background for researchers working in the related areas of statistics."
V.V. Fedorov in "Short Book Reviews", Vol. 21/1, April 2001

Product Details

  • ISBN-13: 9781461271901
  • Publisher: Springer New York
  • Publication date: 4/30/2013
  • Series: Springer Series in Statistics
  • Edition description: Softcover reprint of the original 1st ed. 1999
  • Edition number: 1
  • Pages: 348
  • Product dimensions: 6.14 (w) x 9.21 (h) x 0.76 (d) inches

Table of Contents

I Basic Theory
  • 1 Bootstrap Sampling Distributions: 1.1 Introduction; 1.1.1 Pivotal Method; 1.1.2 Asymptotic Pivotal Method; 1.1.3 Asymptotic Approximation; 1.1.4 Bootstrap Approximation; 1.2 Consistency; 1.3 Case of the Nonparametric Mean; 1.4 Generalizations to Mean-like Statistics; 1.5 Bootstrapping the Empirical Process; 1.6 Differentiability and the Bootstrap; 1.7 Further Examples; 1.8 Hypothesis Testing; 1.9 Conclusions
  • 2 Subsampling in the I.I.D. Case: 2.1 Introduction; 2.2 The Basic Theorem; 2.3 Comparison with the Bootstrap; 2.4 Stochastic Approximation; 2.5 General Parameters and Other Choices of Root; 2.5.1 Studentized Roots; 2.5.2 General Parameter Space; 2.6 Hypothesis Testing; 2.7 Data-Dependent Choice of Block Size; 2.8 Variance Estimation: The Delete-d Jackknife; 2.9 Conclusions
  • 3 Subsampling for Stationary Time Series: 3.1 Introduction; 3.2 Univariate Parameter Case; 3.2.1 Some Motivation: The Simplest Example; 3.2.2 Theory and Methods for the General Univariate Parameter Case; 3.2.3 Studentized Roots; 3.3 Multivariate Parameter Case; 3.4 Examples; 3.5 Hypothesis Testing; 3.6 Data-Dependent Choice of Block Size; 3.7 Bias Reduction; 3.8 Variance Estimation; 3.8.1 General Statistic Case; 3.8.2 Case of the Sample Mean; 3.9 Comparison with the Moving Blocks Bootstrap; 3.10 Conclusions
  • 4 Subsampling for Nonstationary Time Series: 4.1 Introduction; 4.2 Univariate Parameter Case; 4.3 Multivariate Parameter Case; 4.4 Examples; 4.5 Hypothesis Testing and Data-Dependent Choice of Block Size; 4.6 Variance Estimation; 4.7 Conclusions
  • 5 Subsampling for Random Fields: 5.1 Introduction and Definitions; 5.2 Some Useful Notions of Strong Mixing for Random Fields; 5.3 Consistency of Subsampling for Random Fields; 5.3.1 Univariate Parameter Case; 5.3.2 Multivariate Parameter Case; 5.4 Variance Estimation and Bias Reduction; 5.5 Maximum Overlap Subsampling in Continuous Time; 5.6 Some Illustrative Examples; 5.7 Conclusions
  • 6 Subsampling Marked Point Processes: 6.1 Introduction; 6.2 Definitions and Some Different Notions of Mixing; 6.3 Subsampling Stationary Marked Point Processes; 6.3.1 Sampling Setup and Assumptions; 6.3.2 Main Consistency Result; 6.3.3 Nonstandard Asymptotics; 6.4 Stochastic Approximation; 6.5 Variance Estimation via Subsampling; 6.6 Examples; 6.7 Conclusions
  • 7 Confidence Sets for General Parameters: 7.1 Introduction; 7.2 A Basic Theorem for the Empirical Measure; 7.3 A General Theorem on Subsampling; 7.4 Subsampling the Empirical Process; 7.5 Subsampling the Spectral Measure; 7.6 Conclusions

II Extensions, Practical Issues, and Applications
  • 8 Subsampling with Unknown Convergence Rate: 8.1 Introduction; 8.2 Estimation of the Convergence Rate; 8.2.1 Convergence Rate Estimation: Univariate Parameter Case; 8.2.2 Convergence Rate Estimation: Multivariate Parameter Case; 8.3 Subsampling with Estimated Convergence Rate; 8.4 Conclusions
  • 9 Choice of the Block Size: 9.1 Introduction; 9.2 Variance Estimation; 9.2.1 Case of the Sample Mean; 9.2.2 General Case; 9.3 Estimation of a Distribution Function; 9.3.1 Calibration Method; 9.3.2 Minimum Volatility Method; 9.4 Hypothesis Testing; 9.4.1 Calibration Method; 9.4.2 Minimum Volatility Method; 9.5 Two Simulation Studies; 9.5.1 Univariate Mean; 9.5.2 Linear Regression; 9.6 Conclusions; 9.7 Tables
  • 10 Extrapolation, Interpolation, and Higher-Order Accuracy: 10.1 Introduction; 10.2 Background; 10.3 I.I.D. Data: The Sample Mean; 10.3.1 Finite Population Correction; 10.3.2 The Studentized Sample Mean; 10.3.3 Estimation of a Two-Sided Distribution; 10.3.4 Extrapolation; 10.3.5 Robust Interpolation; 10.4 I.I.D. Data: General Statistics; 10.4.1 Extrapolation; 10.4.2 Case of Unknown Convergence Rate to the Asymptotic Approximation; 10.5 Strong Mixing Data; 10.5.1 The Studentized Sample Mean; 10.5.2 Estimation of a Two-Sided Distribution; 10.5.3 The Unstudentized Sample Mean and the General Extrapolation Result; 10.5.4 Finite Population Correction in the Mixing Case; 10.5.5 Bias-Corrected Variance Estimation for Strong Mixing Data; 10.6 Moderate Deviations in Subsampling Distribution Estimation; 10.7 Conclusions
  • 11 Subsampling the Mean with Heavy Tails: 11.1 Introduction; 11.2 Stable Distributions; 11.3 Extension of Previous Theory; 11.4 Subsampling Inference for the Mean with Heavy Tails; 11.4.1 Appealing to a Limiting Stable Law; 11.4.2 Using Self-Normalizing Sums; 11.5 Choice of the Block Size; 11.6 Simulation Study; 11.7 Conclusions; 11.8 Tables
  • 12 Subsampling the Autoregressive Parameter: 12.1 Introduction; 12.2 Extension of Previous Theory; 12.2.1 The Basic Method; 12.2.2 Subsampling Studentized Statistics; 12.3 Subsampling Inference for the Autoregressive Root; 12.4 Choice of the Block Size; 12.5 Simulation Study; 12.6 Conclusions; 12.7 Tables
  • 13 Subsampling Stock Returns: 13.1 Introduction; 13.2 Background and Definitions; 13.2.1 The GMM Approach; 13.2.2 The VAR Approach; 13.2.3 A Bootstrap Approach; 13.3 The Subsampling Approach; 13.4 Two Simulation Studies; 13.4.1 Simulating VAR Data; 13.4.2 Simulating Bootstrap Data; 13.5 A New Look at Return Regressions; 13.6 Additional Looks at Return Regressions; 13.6.1 A Reorganization of Long-Horizon Regressions; 13.6.2 A Joint Test for Multiple Return Horizons; 13.7 Conclusions; 13.8 Tables

Appendices
  • A Some Results on Mixing
  • B A General Central Limit Theorem

References
Index of Names
Index of Subjects
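
As a companion to the i.i.d. sketch in the Overview, the following equally hedged sketch illustrates the stationary time series case treated in Chapter 3: the subsamples are the n - b + 1 overlapping blocks of b consecutive observations, so the dependence structure within each block is preserved. The block length and the AR(1) toy series are arbitrary choices made for the example, not recommendations from the book.

    import numpy as np

    def block_subsample_ci_mean(x, b, alpha=0.05):
        """Subsampling confidence interval for the mean of a stationary series:
        the statistic is recomputed on all n - b + 1 overlapping blocks of
        length b, and the rescaled deviations from the full-sample mean
        approximate the sampling distribution of the root."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        theta_hat = x.mean()
        # Means of all overlapping blocks x[i:i+b], i = 0, ..., n - b.
        block_means = np.convolve(x, np.ones(b) / b, mode="valid")
        roots = np.sqrt(b) * (block_means - theta_hat)
        lo_q, hi_q = np.quantile(roots, [alpha / 2, 1 - alpha / 2])
        return theta_hat - hi_q / np.sqrt(n), theta_hat - lo_q / np.sqrt(n)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # AR(1) toy series with mean 0; block length chosen ad hoc for illustration.
        e = rng.standard_normal(2000)
        y = np.empty_like(e)
        y[0] = e[0]
        for t in range(1, len(e)):
            y[t] = 0.6 * y[t - 1] + e[t]
        print(block_subsample_ci_mean(y, b=60))

In practice the choice of b matters, which is why the book devotes Chapter 9 to data-dependent methods (calibration, minimum volatility) for selecting the block size.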
