More About This Textbook
Overview
Techniques for the analysis of texture in digital images are essential to a range of applications in areas as diverse as robotics, defence, medicine and the geosciences. In biological vision, texture is an important cue allowing humans to discriminate objects. This is because the brain is able to decipher important variations in data at scales smaller than those of the viewed objects. In order to deal with texture in digital data, many techniques have been developed by image processing researchers. With a wholly practical approach and many worked examples, Image Processing: Dealing with Texture is a comprehensive guide to these techniques, including chapters on mathematical morphology, fractals, Markov random fields, Gabor functions and wavelets. Structured around a series of questions and answers, enabling readers to easily locate information on specific problems, this book also:
provides detailed descriptions of methods used to analyse binary as well as grey texture images
presents information on two levels: an easy-to-follow narrative explaining the basics, and an advanced, in-depth study of mathematical theorems and concepts
looks at 'good' and 'bad' image processing practice, with wrongly designed algorithms illustrating 'what not to do'
includes an accompanying website, setting out all algorithms discussed within the text.
An ideal self-teaching aid for senior undergraduate and Masters students taking courses in image processing and pattern recognition, this book is also an ideal reference for PhD students, electrical and biomedical engineers, mathematicians, and informatics researchers designing image processing applications.
Table of Contents
Preface.
1 Introduction.
What is texture?
Why are we interested in texture?
How do we cope with texture when texture is a nuisance?
How does texture give us information about the material of the imaged object?
Are there non-optical images?
What is the meaning of texture in non-optical images?
What is the albedo of a surface?
Can a surface with variable albedo appear non-textured?
Can a rough surface of uniform albedo appear non-textured?
What are the problems of texture which image processing is trying to solve?
What are the limitations of image processing in trying to solve the above problems?
How may the limitations of image processing be overcome for recognising textures?
What is this book about?
Box 1.1. An algorithm for the isolation of textured regions.
2 Binary textures.
Why are we interested in binary textures?
What is this chapter about?
Are there any generic tools appropriate for all types of texture?
Can we at least distinguish classes of texture?
Which are the texture classes?
Which tools are appropriate for each type of texture?
2.1 Shape grammars.
What is a shape grammar?
Box 2.1. Shape grammars.
What happens if the placement of the primitive pattern is not regular?
What happens if the primitive pattern itself is not always the same?
What happens if the primitive patterns vary in a continuous way?
2.2 Boolean models.
What is a 2D Boolean model?
Box 2.2. How can we draw random numbers according to a given probability density function?
Box 2.3. What is a Poisson process?
How can we use the 2D Boolean model to describe a binary texture?
How can we estimate some aggregate parameters of the 2D Boolean model?
How can we estimate some individual parameters of the 2D Boolean model?
Box 2.4. How can we relate the individual parameters to the aggregate parameters of the 2D Boolean model?
What is the simplest possible primitive pattern we may have in a Boolean model?
What is a 1D Boolean model?
How may the 1D Boolean model be used to describe textures?
How can we create 1D strings from a 2D image?
Box 2.5. Hilbert curves.
How can we estimate the parameters of the 1D Boolean model?
Box 2.6. Parameter estimation for the discrete 1D Boolean model.
What happens if the primitive patterns are very irregular?
2.3 Mathematical morphology.
What is mathematical morphology?
What is dilation?
What is erosion?
Is there any way to lose details smaller than a certain size but leave the size of larger details unaffected?
What is opening?
What is closing?
How do we do morphological operations if the structuring element is not symmetric about its centre?
Since the structuring element looks like a small image, can we exchange the roles of object and structuring element?
Is closing a commutative operation?
Can we use different structuring elements for the erosion and the dilation parts of the opening and closing operators?
Can we apply morphological operators to the white pixels of an image instead of applying them to the black pixels?
Can we apply more than one morphological operator to the same image?
Is erosion an associative operation as well?
How can we use morphological operations to characterise a texture?
Box 2.7. Formal definitions in mathematical morphology.
What is the “take home” message of this chapter?
3 Stationary grey texture images.
What is a stationary texture image?
What is this chapter about?
Are any of the methods appropriate for classifying binary textures useful for the analysis of grey textures?
3.1 Image binarisation.
How may a grey image be analysed into a set of binary images by thresholding?
How may a grey image be analysed into a set of binary images by bit-slicing?
Is there any relationship between the binary planes produced by thresholding and the bit planes?
3.2 Grey scale mathematical morphology.
How does mathematical morphology generalise for grey images?
How is the complement of an image defined for grey images?
What is a non-flat structuring element?
What is the relationship between the morphological operations applied to an image and those applied to its complement?
What is the purpose of using a non-flat structuring element?
How can we perform granulometry with a grey image?
Can we extract in one go the details of a signal, peaks or valleys, smaller than a certain size?
How can we use the pattern spectrum to classify textures?
3.3 Fractals.
What is a fractal?
What is the fractal dimension?
Which statistical properties remain the same at all scales in non-deterministic fractals?
Box 3.1. What is self-affine scaling?
Box 3.2. What is the relationship between the fractal dimension and exponent H?
Box 3.3. What is the range of values of H?
What is a fractional Brownian motion?
Box 3.4. Prove that the range of values of H for a fractional Brownian motion is (0,1).
Box 3.5. What is the correlation between two increments of a fractional Brownian motion?
Box 3.6. What is the power spectrum of a fractal?
Box 3.7. Robust line fitting using the RANSAC method.
Box 3.8. What is the autocorrelation function of a fractal?
Is fractal dimension a good texture descriptor?
Is there a way to enrich the description of textures offered by fractal models?
What is lacunarity?
3.4 Markov random fields.
What is a Markov random field?
Which are the neighbouring pixels of a pixel?
How can we use MRFs to characterise textures?
What is texture synthesis by analysis?
How can we apply the Markov model to create textures?
Can we apply the method discussed in the previous section to create images with 256 grey levels?
What is the auto-normal Markov random field model?
How can we estimate the Markov parameters of a texture?
What is maximum likelihood estimation?
What is the log-likelihood?
Box 3.9. What is the relationship between maximum likelihood estimation and Bayesian estimation?
How can we apply maximum likelihood estimation to estimate the parameters of a Markov random field?
How do we know which parameter values to try when we apply MLE to estimate the Markov parameters?
How can we estimate the Markov parameters with the least square error estimation method?
Box 3.10. Least square parameter estimation for the MRF parameters.
Is a Markov random field always realisable given that we define it arbitrarily?
What conditions make an MRF self-consistent?
What is a clique in a neighbourhood structure?
3.5 Gibbs distributions.
What is a Gibbs distribution?
What is a clique potential?
Can we have a Markov random field with only singleton cliques?
What is the relationship between the clique potentials and the Markov parameters?
Box 3.11. Prove the equivalence of Markov random fields and Gibbs distributions (Hammersley–Clifford theorem).
How can we use the Gibbs distribution to create textures?
How can we create an image compatible with a Gibbs model if we are not interested in fixing the histogram of the image?
What is the temperature of a Gibbs distribution?
How does the temperature parameter of the Gibbs distribution determine how distinguishable one configuration is from another?
What is the critical temperature of a Markov random field?
3.6 The autocorrelation function as a texture descriptor.
How can we compute the autocorrelation function of an MRF?
Can we use the autocorrelation function itself to characterise a texture?
How can we use the autocorrelation function directly for texture characterisation?
How can we infer the periodicity of a texture from the autocorrelation function?
How can we extract parametric features from the autocorrelation function?
Box 3.12. Least square fitting in 2D and 1D.
3.7 Texture features from the Fourier transform.
Can we infer the periodicity of a texture directly from its power spectrum?
Does the phase of the Fourier transform convey any useful information?
Since the phase conveys more information for a pattern than its power spectrum, why don’t we use the phase to describe textures?
Is it possible to compute from the image phase a function the value of which changes only due to genuine image changes?
How do we perform phase unwrapping?
What are the drawbacks of the simple phase unwrapping algorithm?
3.8 Co-occurrence matrices.
Can we use non-parametric descriptions of texture?
How is a co-occurrence matrix defined?
How do we compute the co-occurrence matrix in practice?
How can we recognise textures with the help of the co-occurrence matrix?
How can we choose the parameters of the co-occurrence matrix?
What are the higher-order co-occurrence matrices?
What is the “take home” message of this chapter?
4 Non-stationary grey texture images.
What is a non-stationary texture image?
What is this chapter about?
Why can’t we use the methods developed in the previous chapter here?
How can we be sure that the texture inside an image window is stationary?
4.1 The uncertainty principle and its implications in signal and image processing.
What is the uncertainty principle in signal processing?
Box 4.1. Prove the uncertainty principle in signal processing.
Does the window we choose in order to extract local information influence the result?
How can we estimate “what is happening where” in a digital signal?
How can we deal with the variability of the values of a feature?
How do we know which size window we should use?
How is the uncertainty principle generalised to 2D?
4.2 Gabor functions.
What is a Gabor function?
Why are Gabor functions useful in analysing a signal?
How can we use the Gabor functions in practice?
How is a Gabor function generalised in 2D?
How may we use the 2D Gabor functions to analyse an image?
Can we have alternative tessellations of the frequency domain?
How can we define a Gaussian window in polar coordinates in the frequency domain?
What is an octave?
How do we express a frequency in octaves?
How may we choose the parameters of the Gaussian window in the frequency space?
4.3 Prolate spheroidal sequence functions.
Is it possible to have a window with sharp edges in one domain which has minimal side ripples in the other domain?
Box 4.2. Of all the band-limited sequences one can define, which sequence has the maximum energy concentration between a given set of indices?
Box 4.3. Do prolate spheroidal wave functions exist in the digital domain?
What is the relationship of two band-limited functions, the Fourier transforms of which are given by the real functions F(ωx, ωy) and F(−ωx, −ωy), respectively?
How can we construct a filter which is band-limited in two bands which are symmetrically placed about the origin of the axes in the frequency domain?
Box 4.4. How may we generalise the prolate spheroidal sequence functions to 2D?
Could we construct the 2D prolate spheroidal sequence filters as separable filters?
What is the advantage of using separable filters?
4.4 Wavelets.
Is there a way other than using Gabor functions to span the whole spatio-frequency space?
What is a wavelet?
How can we use wavelets to analyse a signal?
Box 4.5. How should we choose the mother wavelet?
Box 4.6. Does the wavelet function minimise the uncertainty inequality?
How is the wavelet transform adapted for digital signals?
How do we compute the wavelet coefficients in practice?
Why is the continuous wavelet transform invertible and the discrete wavelet transform non-invertible?
How can we span the part of the “what happens when” space which contains the direct component of the signal?
Can we span the whole “what is where” space by using only the scaling function?
What is a Laplacian pyramid?
Why is the creation of a Laplacian pyramid associated with the application of a Gaussian function at different scales, and the subtraction of the results?
Why may the second derivative of a Gaussian function be used as a filter to estimate the second derivative of a signal?
How can we extract the coarse resolution content of a signal from its content at a finer resolution?
How can we choose the scaling function?
How do we perform the multiresolution analysis of a signal in practice?
Why in tree wavelet analysis do we always analyse the part of the signal which contains the low frequencies only?
Box 4.7. How do we recover the original signal from its wavelet coefficients in practice?
How many different wavelet filters exist?
How may we use wavelets to process images?
How may we use wavelets to construct texture features?
What is the maximum overlap algorithm?
What is the relationship between Gabor functions and wavelets?
4.5 Where Image Processing and Pattern Recognition meet.
Why in wavelet analysis do we always split the band with the maximum energy?
What is feature selection?
How can we visualise the histogram of more than one feature in order to decide whether they constitute a good feature set?
What is the feature space?
What is the histogram of distances in a feature space?
Is it possible that the histogram of distances does not pick up the presence of clusters, even though clusters are present?
How do we segment the image once we have produced a set of features for each pixel?
What is the Kmeans algorithm?
What is deterministic annealing?
Box 4.8. Maximum entropy clustering.
How may we assess the quality of a segmentation?
How is the Bhattacharyya distance defined?
How can we compute the Bhattacharyya distance in practice?
How may we assess the quality of a segmentation using a manual segmentation as reference?
What is a confusion matrix?
What are the over- and under-detection errors?
4.6 Laws’ masks and the “what looks like where” space.
Is it possible to extract image features without referring to the frequency domain?
How are Laws’ masks defined?
Is there a systematic way to construct features that span the “what looks like where” space completely?
How can we expand a local image neighbourhood in terms of the Walsh elementary images?
Can we use convolution to compute the coefficients of the expansion of a subimage in terms of a set of elementary images?
Is there any other way to express the local structure of theimage?
4.7 Local binary patterns.
What is the local binary pattern approach to texture representation?
How can we make this representation rotationally invariant?
How can we make this representation appropriate for macro-textures?
How can we use the local binary patterns to characterise textures?
What is a metric?
What is a pseudometric?
Why should one wish to use a pseudometric and not a metric?
How can we measure the difference between two histograms?
How can we use the local binary patterns to segment textures?
How can we overcome the shortcomings of the LBP segmentation?
4.8 The Wigner distribution.
What is the Wigner distribution?
How is the Wigner distribution used for the analysis of digital signals?
What is the pseudo-Wigner distribution?
What is the Kaiser window?
What is the Nyquist frequency?
Why does the use of the pseudo-Wigner distribution require signals which have been sampled at twice their Nyquist frequency?
Should we worry about aliasing when we use the pseudo-Wigner distribution for texture analysis?
How is the pseudo-Wigner distribution defined for the analysis of images?
How can the pseudo-Wigner distribution be used for texture segmentation?
What is the “take home” message of this chapter?
Bibliographical notes.
References.
Index.