Random Processes for Image and Signal Processing / Edition 1

by Edward R. Dougherty

ISBN-10: 0819425133

ISBN-13: 9780819425133

Pub. Date: 10/27/1998

Publisher: SPIE Press

"This book gives readers an intuitive appreciation for random functions, plus theory and processes necessary for sophisticated applications. It covers probability theory, random processes, canonical representation, optimal filtering, and random models. Second in the SPIE/IEEE Series on Imaging Science & Engineering. It also presents theory along with

Overview

"This book gives readers an intuitive appreciation for random functions, plus theory and processes necessary for sophisticated applications. It covers probability theory, random processes, canonical representation, optimal filtering, and random models. Second in the SPIE/IEEE Series on Imaging Science & Engineering. It also presents theory along with applications, to help readers intuitively appreciate random functions.

Included are special cases in which probabilistic insight is more readily achievable. Proofs, when given, appear in the main body of the text and are clearly delineated; in other cases, only an outline of the conceptual argument is provided, or the proof is omitted. The intent is to state theorems carefully and to draw a clear distinction between rigorous mathematical argument and heuristic explanation. When a proof can be given at a mathematical level commensurate with the text, and when it enhances conceptual understanding, it is usually provided; otherwise, the effort goes toward explaining the subtleties of the definitions and properties of random functions and stating the conditions under which a proposition applies. Attention is drawn to the differences between deterministic concepts and their random counterparts, for instance in the mean-square calculus, orthonormal representation, and linear filtering. Such differences are sometimes glossed over in methods books; however, failing to distinguish random from deterministic analysis can lead to misinterpretation of experimental results and misuse of techniques.

The author's motivation for the book comes from his experience teaching graduate-level image processing and finding it necessary to teach random processes along the way. Even students who have taken a course on random processes have often done so only in the context of linear operators on signals, an approach that is inadequate for image processing: nonlinear operators play a widening role, and the spatial nature of imaging makes it significantly different from one-dimensional signal processing. Moreover, students who have some background in stochastic processes often lack a unified view in terms of canonical representation and orthogonal projections in inner product spaces."
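
As a concrete instance of the deterministic/random distinction the overview highlights (an editorial illustration, not an excerpt from the book), the derivative of a random process is defined as a limit in mean square rather than pointwise:

X'(t) = \lim_{h \to 0} \frac{X(t+h) - X(t)}{h}
\quad\text{meaning}\quad
\lim_{h \to 0} \mathrm{E}\!\left[ \left| \frac{X(t+h) - X(t)}{h} - X'(t) \right|^{2} \right] = 0 .

For a zero-mean process, mean-square differentiability at t is equivalent to the existence of the second mixed partial derivative of the covariance function K_X(t, s) at s = t, a criterion with no deterministic counterpart.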
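
The unifying theme of canonical representation can likewise be previewed numerically. The sketch below is the editor's, written with NumPy (it is not code from the book); it implements the discrete Karhunen-Loeve expansion treated in Chapter 3, in which a zero-mean random vector is decorrelated by projecting it onto the eigenvectors of its covariance matrix.

import numpy as np

rng = np.random.default_rng(0)

# Simulate N observations of a correlated d-dimensional random vector.
N, d = 10_000, 4
A = rng.standard_normal((d, d))
X = rng.standard_normal((N, d)) @ A.T      # rows are observations; covariance ~ A A^T

# Center the data and estimate the covariance matrix.
Xc = X - X.mean(axis=0)
K = Xc.T @ Xc / (N - 1)                    # sample covariance matrix

# Eigendecomposition K = U diag(lam) U^T (eigh, since K is symmetric).
lam, U = np.linalg.eigh(K)

# Karhunen-Loeve coefficients Z = Xc U: their sample covariance is
# approximately diag(lam), i.e., the coefficients are uncorrelated.
Z = Xc @ U
print(np.round(Z.T @ Z / (N - 1), 2))      # ~ diagonal matrix

Truncating the expansion to the eigenvectors with the largest eigenvalues gives the optimal linear compression discussed in Sections 3.2.4 and 3.6.6 of the table of contents below.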

Product Details

ISBN-13: 9780819425133
Publisher: SPIE Press
Publication date: 10/27/1998
Series: Press Monographs
Edition description: New Edition
Pages: 616
Product dimensions: 7.41(w) x 10.32(h) x 1.47(d)

Table of Contents

Chapter 1.Probability Theory
1.1.Probability Space
1.1.1.Events
1.1.2.Conditional Probability
1.2.Random Variables
1.2.1.Probability Distributions
1.2.2.Probability Densities
1.2.3.Functions of a Random Variable
1.3.Moments
1.3.1.Expectation and Variance
1.3.2.Moment-Generating Function
1.4.Important Probability Distributions
1.4.1.Binomial Distribution
1.4.2.Poisson Distribution
1.4.3.Normal Distribution
1.4.4.Gamma Distribution
1.4.5.Beta Distribution
1.4.6.Computer Simulation
1.5.Multivariate Distributions
1.5.1.Jointly Distributed Random Variables
1.5.2.Conditioning
1.5.3.Independence
1.6.Functions of Several Random Variables
1.6.1.Basic Arithmetic Functions of Two Random Variables
1.6.2.Distributions of Sums of Independent Random Variables
1.6.3.Joint Distributions of Output Random Variables
1.6.4.Expectation of a Function of Several Random Variables
1.6.5.Covariance
1.6.6.Multivariate Normal Distribution
1.7.Laws of Large Numbers
1.7.1.Weak Law of Large Numbers
1.7.2.Strong Law of Large Numbers
1.7.3.Central Limit Theorem
1.8.Parametric Estimation via Random Samples
1.8.1.Random-Sample Estimators
1.8.2.Sample Mean and Sample Variance
1.8.3.Minimum-Variance Unbiased Estimators
1.8.4.Method of Moments
1.8.5.Order Statistics
1.9.Maximum-Likelihood Estimation
1.9.1.Maximum-Likelihood Estimators
1.9.2.Additive Noise
1.9.3.Minimum Noise
1.10.Entropy
1.10.1.Uncertainty
1.10.2.Information
1.10.3.Entropy of a Random Vector
1.11.Source Coding
1.11.1.Prefix Codes
1.11.2.Optimal Coding
Exercises for Chapter 1
Chapter 2.Random Processes
2.1.Random Functions
2.2.Moments of a Random Function
2.2.1.Mean and Covariance Functions
2.2.2.Mean and Covariance of a Sum
2.3.Differentiation
2.3.1.Differentiation of Random Functions
2.3.2.Mean-Square Differentiability
2.4.Integration
2.5.Mean Ergodicity
2.6.Poisson Process
2.6.1.One-dimensional Poisson Model
2.6.2.Derivative of the Poisson Process
2.6.3.Properties of Poisson Points
2.6.4.Axiomatic Formulation of Poisson Temporal and Spatial Processes
2.7.Wiener Process and White Noise
2.7.1.White Noise
2.7.2.Random Walk
2.7.3.Wiener Process
2.8.Stationarity
2.8.1.Wide-Sense Stationarity
2.8.2.Mean-Ergodicity for WS Stationary Processes
2.8.3.Covariance-Ergodicity for WS Stationary Processes
2.8.4.Strict-Sense Stationarity
2.9.Estimation
2.10.Linear Systems
2.10.1.Commutation of a Linear Operator with Expectation
2.10.2.Representation of Linear Operators
Exercises for Chapter 2
Chapter 3.Canonical Representation
3.1.Canonical Expansions
3.1.1.Fourier Representation and Projections
3.1.2.Canonical Expansion of the Covariance Function
3.2.Karhunen-Loeve Expansion
3.2.1.The Karhunen-Loeve Theorem
3.2.2.Discrete Karhunen-Loeve Expansion
3.2.3.Canonical Expansions with Orthonormal Coordinate Functions
3.2.4.Relation to Data Compression
3.3.Noncanonical Representation
3.3.1.Generalized Bessel Inequality
3.3.2.Decorrelation
3.4.Trigonometric Representation
3.4.1.Trigonometric Fourier Series
3.4.2.Generalized Fourier Coefficients for WS Stationary Processes
3.4.3.Mean-Square Periodic WS Stationary Processes
3.5.Canonical Expansions as Transforms
3.5.1.Orthonormal Transforms of Random Functions
3.5.2.Fourier Descriptors
3.6.Transform Coding
3.6.1.Karhunen-Loeve Compression
3.6.2.Transform Compression Using an Arbitrary Orthonormal System
3.6.3.Walsh-Hadamard Transform
3.6.4.Discrete Cosine Transform
3.6.5.Transform Coding for Digital Images
3.6.6.Optimality of the Karhunen-Loeve Transform
3.7.Coefficients Generated by Linear Functionals
3.7.1.Coefficients From Integral Functionals
3.7.2.Generating Bi-Orthogonal Function Systems
3.8.Canonical Expansion of the Covariance Function
3.8.1.Derivation of a Canonical Expansion from a Covariance Expansion
3.8.2.Constructing a Canonical Expansion for the Covariance Function
3.9.Integral Canonical Expansions
3.9.1.Construction via Integral Functional Coefficients
3.9.2.Construction from a Covariance Expansion
3.10.Power Spectral Density
3.10.1.The Power-Spectral-Density/Covariance Transform Pair
3.10.2.Power Spectral Density and Linear Operators
3.10.3.Integral Canonical Representation of a WS Stationary Random Function
3.11.Canonical Expansions of Vector Random Functions
3.11.1.Vector Random Functions
3.11.2.Canonical Expansions for Vector Random Functions
3.11.3.Finite Sets of Random Vectors
3.12.Canonical Representation over a Discrete Set
Exercises for Chapter 3
Chapter 4.Optimal Filtering
4.1.Optimal Mean-Square-Error Filters
4.1.1.Conditional Expectation
4.1.2.Optimal Nonlinear Filter
4.1.3.Optimal Filter for Jointly Normal Random Variables
4.1.4.Multiple Observation Variables
4.1.5.Bayesian Parametric Estimation
4.2.Optimal Finite-Observation Linear Filters
4.2.1.Linear Filters and the Orthogonality Principle
4.2.2.Design of the Optimal Linear Filter
4.2.3.Optimal Linear Filter in the Jointly Gaussian Case
4.2.4.Role of Wide-Sense Stationarity
4.2.5.Signal-plus-Noise Model
4.2.6.Edge Detection
4.3.Steepest Descent
4.3.1.Steepest Descent Iterative Algorithm
4.3.2.Convergence of the Steepest-Descent Algorithm
4.3.3.Least-Mean-Square Adaptive Algorithm
4.3.4.Convergence of the LMS Algorithm
4.3.5.Nonstationary Processes
4.4.Least-Squares Estimation
4.4.1.Pseudoinverse Estimator
4.4.2.Least-Squares Estimation for Nonwhite Noise
4.4.3.Multiple Linear Regression
4.4.4.Least-Squares Image Restoration
4.5.Optimal Linear Filters for Random Vectors
4.5.1.Optimal Linear Filter for Linearly Dependent Observations
4.5.2.Optimal Estimation of Random Vectors
4.5.3.Optimal Linear Filters for Random Vectors
4.6.Recursive Linear Filters
4.6.1.Recursive Generation of Direct Sums
4.6.2.Static Recursive Optimal Linear Filtering
4.6.3.Dynamic Recursive Optimal Linear Filtering
4.7.Optimal Infinite-Observation Linear Filters
4.7.1.Wiener-Hopf Equation
4.7.2.Wiener Filter
4.8.Optimal Linear Filters in the Context of a Linear Model
4.8.1.The Linear Signal Model
4.8.2.Procedure for Finding the Optimal Linear Filter
4.8.3.Additive White Noise
4.8.4.Discrete Domains
4.9.Optimal Linear Filters via Canonical Expansions
4.9.1.Integral Decomposition into White Noise
4.9.2.Integral Equations Involving the Autocorrelation Function
4.9.3.Solution Via Discrete Canonical Expansions
4.10.Optimal Binary Filters
4.10.1.Binary Conditional Expectation
4.10.2.Boolean Functions and Optimal Translation-Invariant Filters
4.10.3.Optimal Increasing Filters
4.11.Pattern Classification
4.11.1.Optimal Classifiers
4.11.2.Gaussian Maximum-Likelihood Classification
4.11.3.Linear Discriminants
4.12.Neural Networks
4.12.1.Two-Layer Neural Networks
4.12.2.Steepest Descent for Nonquadratic Error Surfaces
4.12.3.Sum-of-Squares Error
4.12.4.Error Back-Propagation
4.12.5.Error Back-Propagation for Multiple Outputs
4.12.6.Adaptive Network Design
Exercises for Chapter 4
Chapter 5.Random Models
5.1.Markov Chains
5.1.1.Chapman-Kolmogorov Equations
5.1.2.Transition Probability Matrix
5.1.3.Markov Processes
5.2.Steady-State Distributions for Discrete-Time Markov Chains
5.2.1.Long-Run Behavior of a Two-State Markov Chain
5.2.2.Classification of States
5.2.3.Steady-State and Stationary Distributions
5.2.4.Long-Run Behavior of Finite Markov Chains
5.2.5.Long-Run Behavior of Markov Chains with Infinite State Spaces
5.3.Steady-State Distributions for Continuous-Time Markov Chains
5.3.1.Steady-State Distribution of an Irreducible Continuous-Time Markov Chain
5.3.2.Birth-Death Model Queues
5.3.3.Forward and Backward Kolmogorov Equations
5.4.Markov Random Fields
5.4.1.Neighborhood Systems
5.4.2.Determination by Conditional Probabilities
5.4.3.Gibbs Distributions
5.5.Random Boolean Model
5.5.1.Germ-Grain Model
5.5.2.Vacancy
5.5.3.Hitting
5.5.4.Linear Boolean Model
5.6.Granulometries
5.6.1.Openings
5.6.2.Classification by Granulometric Moments
5.6.3.Adaptive Reconstructive Openings
5.7.Random Sets
5.7.1.Hit-or-Miss Topology
5.7.2.Convergence and Continuity
5.7.3.RACS
5.7.4.Capacity Functional
Exercises for Chapter 5
Bibliography
