Image Analysis, Random Fields and Dynamic Monte Carlo Methods: A Mathematical Introduction

by Gerhard Winkler

Paperback (Softcover reprint of the original 1st ed. 1995)

$129.00

Overview

This text is concerned with a probabilistic approach to image analysis as initiated by U. GRENANDER, D. and S. GEMAN, B.R. HUNT and many others, and developed and popularized by D. and S. GEMAN in a paper from 1984. It formally adopts the Bayesian paradigm and therefore is referred to as 'Bayesian Image Analysis'. There has been considerable and still growing interest in prior models and, in particular, in discrete Markov random field methods. Whereas image analysis is replete with ad hoc techniques, Bayesian image analysis provides a general framework encompassing various problems from imaging. Among these are such 'classical' applications as restoration, edge detection, texture discrimination, motion analysis and tomographic reconstruction. The subject is rapidly developing and in the near future is likely to deal with high-level applications like object recognition. Fascinating experiments by Y. CHOW, U. GRENANDER and D.M. KEENAN (1987, 1990) strongly support this belief.
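
To make the framework concrete, here is a minimal sketch (an illustration for this page, not code from the book) of Bayesian restoration of a noisy binary image: a single-site Gibbs sampler draws from a posterior built on an Ising-type Markov random field prior, in the spirit of the Geman and Geman (1984) approach described above. The function name gibbs_restore and the parameter values beta, sigma and n_sweeps are assumptions chosen for the example.

    import numpy as np

    def gibbs_restore(y, beta=1.0, sigma=0.5, n_sweeps=50, rng=None):
        """Draw x in {-1,+1}^(H,W) from the posterior p(x | y) proportional to
        exp(beta * sum over neighbour pairs of x_s * x_t) * prod_s N(y_s; x_s, sigma^2).
        Illustrative sketch only; beta, sigma and n_sweeps are assumed values."""
        rng = np.random.default_rng() if rng is None else rng
        H, W = y.shape
        x = np.where(y > 0, 1, -1)            # start at the thresholded observation
        for _ in range(n_sweeps):             # raster-scan visiting scheme
            for i in range(H):
                for j in range(W):
                    # sum over the 4-neighbourhood, with free boundary conditions
                    nb = 0.0
                    if i > 0:     nb += x[i - 1, j]
                    if i < H - 1: nb += x[i + 1, j]
                    if j > 0:     nb += x[i, j - 1]
                    if j < W - 1: nb += x[i, j + 1]
                    # local conditional: log-odds of x[i, j] = +1 versus -1
                    log_odds = 2.0 * beta * nb + 2.0 * y[i, j] / sigma**2
                    p_plus = 1.0 / (1.0 + np.exp(-log_odds))
                    x[i, j] = 1 if rng.random() < p_plus else -1
        return x

    # Usage: restore a two-region test image corrupted by Gaussian noise.
    rng = np.random.default_rng(0)
    truth = np.ones((32, 32))
    truth[:, :16] = -1
    y = truth + rng.normal(0.0, 0.5, size=truth.shape)
    x_hat = gibbs_restore(y, beta=0.8, sigma=0.5, n_sweeps=30, rng=rng)
    print("pixel agreement with truth:", (x_hat == truth).mean())

The raster-scan sweep used here is only one of the visiting schemes the book analyzes; lowering a temperature parameter across sweeps would turn the same sampler into simulated annealing for MAP estimation.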

Product Details

ISBN-13: 9783642975240
Publisher: Springer Berlin Heidelberg
Publication date: 01/19/2012
Series: Stochastic Modelling and Applied Probability, #27
Edition description: Softcover reprint of the original 1st ed. 1995
Pages: 324
Product dimensions: 6.10(w) x 9.25(h) x 0.03(d)

Table of Contents

I. Bayesian Image Analysis: Introduction
1. The Bayesian Paradigm
   1.1 The Space of Images
   1.2 The Space of Observations
   1.3 Prior and Posterior Distribution
   1.4 Bayesian Decision Rules
2. Cleaning Dirty Pictures
   2.1 Distortion of Images
       2.1.1 Physical Digital Imaging Systems
       2.1.2 Posterior Distributions
   2.2 Smoothing
   2.3 Piecewise Smoothing
   2.4 Boundary Extraction
3. Random Fields
   3.1 Markov Random Fields
   3.2 Gibbs Fields and Potentials
   3.3 More on Potentials
II. The Gibbs Sampler and Simulated Annealing
4. Markov Chains: Limit Theorems
   4.1 Preliminaries
   4.2 The Contraction Coefficient
   4.3 Homogeneous Markov Chains
   4.4 Inhomogeneous Markov Chains
5. Sampling and Annealing
   5.1 Sampling
   5.2 Simulated Annealing
   5.3 Discussion
6. Cooling Schedules
   6.1 The ICM Algorithm
   6.2 Exact MAPE Versus Fast Cooling
   6.3 Finite Time Annealing
7. Sampling and Annealing Revisited
   7.1 A Law of Large Numbers for Inhomogeneous Markov Chains
       7.1.1 The Law of Large Numbers
       7.1.2 A Counterexample
   7.2 A General Theorem
   7.3 Sampling and Annealing under Constraints
       7.3.1 Simulated Annealing
       7.3.2 Simulated Annealing under Constraints
       7.3.3 Sampling with and without Constraints
III. More on Sampling and Annealing
8. Metropolis Algorithms
   8.1 The Metropolis Sampler
   8.2 Convergence Theorems
   8.3 Best Constants
   8.4 About Visiting Schemes
       8.4.1 Systematic Sweep Strategies
       8.4.2 The Influence of Proposal Matrices
   8.5 The Metropolis Algorithm in Combinatorial Optimization
   8.6 Generalizations and Modifications
       8.6.1 Metropolis-Hastings Algorithms
       8.6.2 Threshold Random Search
9. Alternative Approaches
   9.1 Second Largest Eigenvalues
       9.1.1 Convergence Reproved
       9.1.2 Sampling and Second Largest Eigenvalues
       9.1.3 Continuous Time and Space
10. Parallel Algorithms
   10.1 Partially Parallel Algorithms
       10.1.1 Synchronous Updating on Independent Sets
       10.1.2 The Swendsen-Wang Algorithm
   10.2 Synchronous Algorithms
       10.2.1 Introduction
       10.2.2 Invariant Distributions and Convergence
       10.2.3 Support of the Limit Distribution
   10.3 Synchronous Algorithms and Reversibility
       10.3.1 Preliminaries
       10.3.2 Invariance and Reversibility
       10.3.3 Final Remarks
IV. Texture Analysis
11. Partitioning
   11.1 Introduction
   11.2 How to Tell Textures Apart
   11.3 Features
   11.4 Bayesian Texture Segmentation
       11.4.1 The Features
       11.4.2 The Kolmogorov-Smirnov Distance
       11.4.3 A Partition Model
       11.4.4 Optimization
       11.4.5 A Boundary Model
   11.5 Julesz’s Conjecture
       11.5.1 Introduction
       11.5.2 Point Processes
12. Texture Models and Classification
   12.1 Introduction
   12.2 Texture Models
       12.2.1 The Φ-Model
       12.2.2 The Autobinomial Model
       12.2.3 Automodels
   12.3 Texture Synthesis
   12.4 Texture Classification
       12.4.1 General Remarks
       12.4.2 Contextual Classification
       12.4.3 MPM Methods
V. Parameter Estimation
13. Maximum Likelihood Estimators
   13.1 Introduction
   13.2 The Likelihood Function
   13.3 Objective Functions
   13.4 Asymptotic Consistency
14. Spatial ML Estimation
   14.1 Introduction
   14.2 Increasing Observation Windows
   14.3 The Pseudolikelihood Method
   14.4 The Maximum Likelihood Method
   14.5 Computation of ML Estimators
   14.6 Partially Observed Data
VI. Supplement
15. A Glance at Neural Networks
   15.1 Introduction
   15.2 Boltzmann Machines
   15.3 A Learning Rule
16. Mixed Applications
   16.1 Motion
   16.2 Tomographic Image Reconstruction
   16.3 Biological Shape
VII. Appendix
A. Simulation of Random Variables
   A.1 Pseudo-random Numbers
   A.2 Discrete Random Variables
   A.3 Local Gibbs Samplers
   A.4 Further Distributions
       A.4.1 Binomial Variables
       A.4.2 Poisson Variables
       A.4.3 Gaussian Variables
       A.4.4 The Rejection Method
       A.4.5 The Polar Method
B. The Perron-Frobenius Theorem
C. Concave Functions
D. A Global Convergence Theorem for Descent Algorithms
References
