Random Processes for Image and Signal Processing / Edition 1

Hardcover (Print)


"This book gives readers an intuitive appreciation for random functions, plus the theory and processes necessary for sophisticated applications. It covers probability theory, random processes, canonical representation, optimal filtering, and random models, and is the second volume in the SPIE/IEEE Series on Imaging Science & Engineering.

Included are special cases in which probabilistic insight is more readily achievable. Proofs, when provided, appear in the main body of the text and are clearly delineated; in other cases, either no proof is given or only an outline of the conceptual argument is provided. The intent is to state theorems carefully and to draw clear distinctions between rigorous mathematical arguments and heuristic explanations. When a proof can be given at a mathematical level commensurate with the text, and when it enhances conceptual understanding, it is usually provided; otherwise, the effort is to explain the subtleties of the definitions and properties of random functions and to state the conditions under which a proposition applies. Attention is drawn to the differences between deterministic concepts and their random counterparts, for instance in the mean-square calculus, orthonormal representation, and linear filtering. Such differences are sometimes glossed over in methods books; however, failing to distinguish random from deterministic analysis can lead to misinterpretation of experimental results and misuse of techniques.

The author's motivation for the book comes from his experience in teaching graduate-level image processing and having to end up teaching random processes. Even students who have taken a course on random processes have often done so in the context of linear operators on signals. This approach is inadequate for image processing. Nonlinear operators play a widening role in image processing, and the spatial nature of imaging makes it significantly different from one-dimensional signal processing. Moreover, students who have some background in stochastic processes often lack a unified view in terms of canonical representation and orthogonal projections in inner product spaces."
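The distinction the description draws between deterministic concepts and their random counterparts can be made concrete with a short simulation of the weak law of large numbers (Section 1.7.1 in the table of contents below): the sample mean of i.i.d. draws converges to the true mean only in a probabilistic sense, not as a pointwise deterministic limit. This is a minimal illustrative sketch, not taken from the book; the uniform distribution, seed, and sample sizes are arbitrary choices.

```python
import numpy as np

# Draw i.i.d. samples from a distribution with known mean.
rng = np.random.default_rng(0)
true_mean = 0.5
samples = rng.uniform(0.0, 1.0, size=100_000)  # uniform on [0, 1], mean 1/2

# Running sample mean after n = 1, 2, ..., 100000 observations.
running_mean = np.cumsum(samples) / np.arange(1, samples.size + 1)

# Any single realization fluctuates for small n; the weak law says the
# deviation from the true mean shrinks (in probability) as n grows.
err_small_n = abs(running_mean[9] - true_mean)   # after 10 samples
err_large_n = abs(running_mean[-1] - true_mean)  # after 100,000 samples
```

For 100,000 uniform samples the standard deviation of the sample mean is roughly 0.0009, so the final error is almost surely well under 0.01, while a 10-sample mean routinely misses by an order of magnitude more.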

"This instructive, wide-ranging book is useful in pursuing ground-breaking research, basic or applied...provides the understanding necessary for sophisticated applications and a better grasp of the underlying processes."


Editorial Reviews

Concerns the engineering purposes of the analysis and synthesis of linear and nonlinear systems that operate on spatial and temporal random functions. Because the author's major concern is algorithm development, he focuses on three basic problems of operator synthesis: representation, filter design, and modeling. Chapters cover basic probability, random functions, canonical representation, transform coding, optimal filter design, neural networks, Markov chains, and the theory of random closed sets. Annotation c. by Book News, Inc., Portland, Or.
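One of the representation topics the review mentions, the discrete Karhunen-Loeve expansion (Section 3.2.2 in the table of contents below), can be sketched numerically: projecting a random vector onto the eigenvectors of its covariance matrix yields uncorrelated coefficients. This is a hedged illustration, not code from the book; the 2x2 covariance matrix K is a made-up example.

```python
import numpy as np

# Made-up covariance matrix of a zero-mean random vector X.
K = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigendecomposition of the symmetric matrix K: K = V diag(w) V^T,
# with orthonormal eigenvectors in the columns of V.
w, V = np.linalg.eigh(K)

# The KL coefficients Z = V^T X are uncorrelated, with variances given
# by the eigenvalues w. Check: Cov(Z) = V^T K V should be diagonal.
cov_Z = V.T @ K @ V
off_diag = cov_Z[0, 1]
```

The diagonal entries of `cov_Z` are the eigenvalues (here 1 and 3), and the vanishing off-diagonal entry is exactly the decorrelation property that makes the Karhunen-Loeve transform optimal for compression (Section 3.6.6).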

Product Details

  • ISBN-13: 9780819425133
  • Publisher: SPIE Press
  • Publication date: 10/27/1998
  • Series: Press Monographs
  • Edition description: New Edition
  • Edition number: 1
  • Pages: 616
  • Product dimensions: 7.41 (w) x 10.32 (h) x 1.47 (d)

Table of Contents

Chapter 1. Probability Theory
1.1. Probability Space
1.1.1. Events
1.1.2. Conditional Probability
1.2. Random Variables
1.2.1. Probability Distributions
1.2.2. Probability Densities
1.2.3. Functions of a Random Variable
1.3. Moments
1.3.1. Expectation and Variance
1.3.2. Moment-Generating Function
1.4. Important Probability Distributions
1.4.1. Binomial Distribution
1.4.2. Poisson Distribution
1.4.3. Normal Distribution
1.4.4. Gamma Distribution
1.4.5. Beta Distribution
1.4.6. Computer Simulation
1.5. Multivariate Distributions
1.5.1. Jointly Distributed Random Variables
1.5.2. Conditioning
1.5.3. Independence
1.6. Functions of Several Random Variables
1.6.1. Basic Arithmetic Functions of Two Random Variables
1.6.2. Distributions of Sums of Independent Random Variables
1.6.3. Joint Distributions of Output Random Variables
1.6.4. Expectation of a Function of Several Random Variables
1.6.5. Covariance
1.6.6. Multivariate Normal Distribution
1.7. Laws of Large Numbers
1.7.1. Weak Law of Large Numbers
1.7.2. Strong Law of Large Numbers
1.7.3. Central Limit Theorem
1.8. Parametric Estimation via Random Samples
1.8.1. Random-Sample Estimators
1.8.2. Sample Mean and Sample Variance
1.8.3. Minimum-Variance Unbiased Estimators
1.8.4. Method of Moments
1.8.5. Order Statistics
1.9. Maximum-Likelihood Estimation
1.9.1. Maximum-Likelihood Estimators
1.9.2. Additive Noise
1.9.3. Minimum Noise
1.10. Entropy
1.10.1. Uncertainty
1.10.2. Information
1.10.3. Entropy of a Random Vector
1.11. Source Coding
1.11.1. Prefix Codes
1.11.2. Optimal Coding
Exercises for Chapter 1
Chapter 2. Random Processes
2.1. Random Functions
2.2. Moments of a Random Function
2.2.1. Mean and Covariance Functions
2.2.2. Mean and Covariance of a Sum
2.3. Differentiation
2.3.1. Differentiation of Random Functions
2.3.2. Mean-Square Differentiability
2.4. Integration
2.5. Mean Ergodicity
2.6. Poisson Process
2.6.1. One-dimensional Poisson Model
2.6.2. Derivative of the Poisson Process
2.6.3. Properties of Poisson Points
2.6.4. Axiomatic Formulation of Poisson Temporal and Spatial Processes
2.7. Wiener Process and White Noise
2.7.1. White Noise
2.7.2. Random Walk
2.7.3. Wiener Process
2.8. Stationarity
2.8.1. Wide-Sense Stationarity
2.8.2. Mean-Ergodicity for WS Stationary Processes
2.8.3. Covariance-Ergodicity for WS Stationary Processes
2.8.4. Strict-Sense Stationarity
2.9. Estimation
2.10. Linear Systems
2.10.1. Commutation of a Linear Operator with Expectation
2.10.2. Representation of Linear Operators
Exercises for Chapter 2
Chapter 3. Canonical Representation
3.1. Canonical Expansions
3.1.1. Fourier Representation and Projections
3.1.2. Canonical Expansion of the Covariance Function
3.2. Karhunen-Loeve Expansion
3.2.1. The Karhunen-Loeve Theorem
3.2.2. Discrete Karhunen-Loeve Expansion
3.2.3. Canonical Expansions with Orthonormal Coordinate Functions
3.2.4. Relation to Data Compression
3.3. Noncanonical Representation
3.3.1. Generalized Bessel Inequality
3.3.2. Decorrelation
3.4. Trigonometric Representation
3.4.1. Trigonometric Fourier Series
3.4.2. Generalized Fourier Coefficients for WS Stationary Processes
3.4.3. Mean-Square Periodic WS Stationary Processes
3.5. Canonical Expansions as Transforms
3.5.1. Orthonormal Transforms of Random Functions
3.5.2. Fourier Descriptors
3.6. Transform Coding
3.6.1. Karhunen-Loeve Compression
3.6.2. Transform Compression Using an Arbitrary Orthonormal System
3.6.3. Walsh-Hadamard Transform
3.6.4. Discrete Cosine Transform
3.6.5. Transform Coding for Digital Images
3.6.6. Optimality of the Karhunen-Loeve Transform
3.7. Coefficients Generated by Linear Functionals
3.7.1. Coefficients From Integral Functionals
3.7.2. Generating Bi-Orthogonal Function Systems
3.8. Canonical Expansion of the Covariance Function
3.8.1. Derivation of a Canonical Expansion from a Covariance Expansion
3.8.2. Constructing a Canonical Expansion for the Covariance Function
3.9. Integral Canonical Expansions
3.9.1. Construction via Integral Functional Coefficients
3.9.2. Construction from a Covariance Expansion
3.10. Power Spectral Density
3.10.1. The Power-Spectral-Density/Covariance Transform Pair
3.10.2. Power Spectral Density and Linear Operators
3.10.3. Integral Canonical Representation of a WS Stationary Random Function
3.11. Canonical Expansions of Vector Random Functions
3.11.1. Vector Random Functions
3.11.2. Canonical Expansions for Vector Random Functions
3.11.3. Finite Sets of Random Vectors
3.12. Canonical Representation over a Discrete Set
Exercises for Chapter 3
Chapter 4. Optimal Filtering
4.1. Optimal Mean-Square-Error Filters
4.1.1. Conditional Expectation
4.1.2. Optimal Nonlinear Filter
4.1.3. Optimal Filter for Jointly Normal Random Variables
4.1.4. Multiple Observation Variables
4.1.5. Bayesian Parametric Estimation
4.2. Optimal Finite-Observation Linear Filters
4.2.1. Linear Filters and the Orthogonality Principle
4.2.2. Design of the Optimal Linear Filter
4.2.3. Optimal Linear Filter in the Jointly Gaussian Case
4.2.4. Role of Wide-Sense Stationarity
4.2.5. Signal-plus-Noise Model
4.2.6. Edge Detection
4.3. Steepest Descent
4.3.1. Steepest Descent Iterative Algorithm
4.3.2. Convergence of the Steepest-Descent Algorithm
4.3.3. Least-Mean-Square Adaptive Algorithm
4.3.4. Convergence of the LMS Algorithm
4.3.5. Nonstationary Processes
4.4. Least-Squares Estimation
4.4.1. Pseudoinverse Estimator
4.4.2. Least-Squares Estimation for Nonwhite Noise
4.4.3. Multiple Linear Regression
4.4.4. Least-Squares Image Restoration
4.5. Optimal Linear Filters for Random Vectors
4.5.1. Optimal Linear Filter for Linearly Dependent Observations
4.5.2. Optimal Estimation of Random Vectors
4.5.3. Optimal Linear Filters for Random Vectors
4.6. Recursive Linear Filters
4.6.1. Recursive Generation of Direct Sums
4.6.2. Static Recursive Optimal Linear Filtering
4.6.3. Dynamic Recursive Optimal Linear Filtering
4.7. Optimal Infinite-Observation Linear Filters
4.7.1. Wiener-Hopf Equation
4.7.2. Wiener Filter
4.8. Optimal Linear Filters in the Context of a Linear Model
4.8.1. The Linear Signal Model
4.8.2. Procedure for Finding the Optimal Linear Filter
4.8.3. Additive White Noise
4.8.4. Discrete Domains
4.9. Optimal Linear Filters via Canonical Expansions
4.9.1. Integral Decomposition into White Noise
4.9.2. Integral Equations Involving the Autocorrelation Function
4.9.3. Solution Via Discrete Canonical Expansions
4.10. Optimal Binary Filters
4.10.1. Binary Conditional Expectation
4.10.2. Boolean Functions and Optimal Translation-Invariant Filters
4.10.3. Optimal Increasing Filters
4.11. Pattern Classification
4.11.1. Optimal Classifiers
4.11.2. Gaussian Maximum-Likelihood Classification
4.11.3. Linear Discriminants
4.12. Neural Networks
4.12.1. Two-Layer Neural Networks
4.12.2. Steepest Descent for Nonquadratic Error Surfaces
4.12.3. Sum-of-Squares Error
4.12.4. Error Back-Propagation
4.12.5. Error Back-Propagation for Multiple Outputs
4.12.6. Adaptive Network Design
Exercises for Chapter 4
Chapter 5. Random Models
5.1. Markov Chains
5.1.1. Chapman-Kolmogorov Equations
5.1.2. Transition Probability Matrix
5.1.3. Markov Processes
5.2. Steady-State Distributions for Discrete-Time Markov Chains
5.2.1. Long-Run Behavior of a Two-State Markov Chain
5.2.2. Classification of States
5.2.3. Steady-State and Stationary Distributions
5.2.4. Long-Run Behavior of Finite Markov Chains
5.2.5. Long-Run Behavior of Markov Chains with Infinite State Spaces
5.3. Steady-State Distributions for Continuous-Time Markov Chains
5.3.1. Steady-State Distribution of an Irreducible Continuous-Time Markov Chain
5.3.2. Birth-Death Model Queues
5.3.3. Forward and Backward Kolmogorov Equations
5.4. Markov Random Fields
5.4.1. Neighborhood Systems
5.4.2. Determination by Conditional Probabilities
5.4.3. Gibbs Distributions
5.5. Random Boolean Model
5.5.1. Germ-Grain Model
5.5.2. Vacancy
5.5.3. Hitting
5.5.4. Linear Boolean Model
5.6. Granulometries
5.6.1. Openings
5.6.2. Classification by Granulometric Moments
5.6.3. Adaptive Reconstructive Openings
5.7. Random Sets
5.7.1. Hit-or-Miss Topology
5.7.2. Convergence and Continuity
5.7.3. RACS
5.7.4. Capacity Functional
Exercises for Chapter 5
