Signal Analysis: Time, Frequency, Scale, and Structure / Edition 1

by Ronald L. Allen, Duncan Mills

ISBN-10: 0471234419
ISBN-13: 9780471234418
Pub. Date: 01/02/2004
Publisher: Wiley

Hardcover

$184.95

Overview

  • Offers a well-rounded, mathematical approach to problems in signal interpretation using the latest time, frequency, and mixed-domain methods
  • Equally useful as a reference, an up-to-date review, a learning tool, and a resource for signal analysis techniques
  • Provides a gradual introduction to the mathematics so that the less mathematically adept reader will not be overwhelmed with instant hard analysis
  • Covers Hilbert spaces, complex analysis, distributions, random signals, analog Fourier transforms, and more

Product Details

ISBN-13: 9780471234418
Publisher: Wiley
Publication date: 01/02/2004
Pages: 960
Product dimensions: 6.30(w) x 9.50(h) x 1.95(d)

About the Author

RONALD L. ALLEN received his BA in mathematics from the University of California, Berkeley in 1973, his MA in mathematics from the University of California, Los Angeles in 1975, and his MS and PhD in Computer Science from the University of Texas at Arlington in 1990 and 1993, respectively.

DUNCAN W. MILLS received his BA in Physics from Wesleyan University, his MS in Electrical Engineering from George Washington University, and his PhD in Electrical Engineering from University of Texas at Dallas in 1992.

Read an Excerpt

Signal Analysis

Time, Frequency, Scale, and Structure
By Ronald L. Allen and Duncan W. Mills

John Wiley & Sons

Copyright © 2004 Institute of Electrical and Electronics Engineers, Inc.
All rights reserved.

ISBN: 0-471-23441-9


Chapter One

Signals: Analog, Discrete, and Digital

Analog, discrete, and digital signals are the raw material of signal processing and analysis. Natural processes, whether dependent upon or independent of human control, generate analog signals; they occur in a continuous fashion over an interval of time or space. The mathematical model of an analog signal is a function defined over a part of the real number line. Analog signal conditioning uses conventional electronic circuitry to acquire, amplify, filter, and transmit these signals. At some point, digital processing may take place; today, this is almost always necessary. Perhaps the application requires superior noise immunity. Intricate processing steps are also easier to implement on digital computers. Furthermore, it is easier to improve and correct computerized algorithms than systems comprised of hard-wired analog components. Whatever the rationale for digital processing, the analog signal is captured, stored momentarily, and then converted to digital form. In contrast to an analog signal, a discrete signal has values only at isolated points. Its mathematical representation is a function on the integers; this is a fundamental difference. When the signal values are of finite precision, so that they can be stored in the registers of a computer, then the discrete signal is more precisely known as a digital signal. Digital signals thus come from sampling an analog signal, and (although there is such a thing as an analog computer) nowadays digital machines perform almost all analytical computations on discrete signal data.
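
The analog-to-discrete chain just described can be sketched in a few lines of Python; the 5 Hz sinusoid and the names here are illustrative, not drawn from the text.

```python
import math

# Illustrative sketch: an analog signal is modeled as a function over the
# reals, and a discrete signal is obtained by reading it at isolated,
# integer-indexed points x(nT).

def x(t):
    """Analog signal model: a 5 Hz sinusoid, defined for every real t."""
    return math.sin(2 * math.pi * 5 * t)

def sample(x, T, n_samples):
    """Discrete signal: the sequence x(nT), indexed by the integers n."""
    return [x(n * T) for n in range(n_samples)]

T = 0.01                  # sampling interval in seconds (100 samples/s)
x_n = sample(x, T, 8)     # exact, instantaneous samples of the analog model
```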

This has not, of course, always been the case; only recently have discrete techniques come to dominate signal processing. The reasons for this are both theoretical and practical.

On the practical side, nineteenth-century inventions for transmitting words, the telegraph and the telephone (written and spoken language, respectively), mark the beginnings of engineered signal generation and interpretation technologies. Mathematics that supports signal processing began long ago, of course. But only in the nineteenth century did signal theory begin to distinguish itself as a technical, engineering, and scientific pursuit separate from pure mathematics. Until then, scientists did not see mathematical entities (polynomials, sinusoids, and exponential functions, for example) as sequences of symbols or carriers of information. They were envisioned instead as ideal shapes, motions, patterns, or models of natural processes.

The development of electromagnetic theory and the growth of electrical and electronic communications technologies began to divide these sciences. The functions of mathematics came to be studied as bearing information, requiring modification to be useful, suitable for interpretation, and having a meaning. The life story of this new discipline-signal processing, communications, signal analysis, and information theory-would follow a curious and ironic path. Electromagnetic waves consist of coupled electric and magnetic fields that oscillate in a sinusoidal pattern and are perpendicular to one another and to their direction of propagation. Fourier discovered that very general classes of functions, even those containing discontinuities, could be represented by sums of sinusoidal functions, now called a Fourier series. This surprising insight, together with the great advances in analog communication methods at the beginning of the twentieth century, captured the most attention from scientists and engineers.
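
Fourier's insight can be illustrated numerically: a unit square wave, despite its discontinuities, is the limit of the partial sums (4/pi) * sum of sin((2k+1)t)/(2k+1). A small sketch, with the term count chosen arbitrarily:

```python
import math

def square_partial_sum(t, n_terms):
    """Partial Fourier series of a unit square wave:
    (4/pi) * sum over odd harmonics of sin((2k+1)*t) / (2k+1)."""
    return (4 / math.pi) * sum(
        math.sin((2 * k + 1) * t) / (2 * k + 1) for k in range(n_terms)
    )

# Away from the discontinuities, more terms give a better approximation
# of the square wave's value (here +1 at t = pi/2).
approx = square_partial_sum(math.pi / 2, 200)
```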

Research efforts into discrete techniques were producing important results, even as the analog age of signal processing and communication technology charged ahead. Discrete Fourier series calculations were widely understood, but seldom carried out; they demanded quite a bit of labor with pencil and paper. The first theoretical links between analog and discrete signals were found in the 1920s by Nyquist, in the course of research on optimal telegraphic transmission mechanisms. Shannon built upon Nyquist's discovery with his famous sampling theorem. He also proved something to be feasible that no one else even thought possible: error-free digital communication over noisy channels. Soon thereafter, in the late 1940s, digital computers began to appear. These early monsters were capable of performing signal processing operations, but their speed remained too slow for some of the most important computations in signal processing: the discrete versions of the Fourier series. All this changed two decades later when Cooley and Tukey disclosed their fast Fourier transform (FFT) algorithm to an eager computing public. Digital computations of Fourier's series were now practical on real-time signal data, and in the following years digital methods would proliferate. At the present time, digital systems have supplanted much analog circuitry, and they are the core of almost all signal processing and analysis systems. Analog techniques handle only the early signal input, output, and conditioning chores.
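
The Cooley-Tukey idea mentioned above can be sketched as a minimal radix-2 recursion; this is an illustrative toy (input length a power of two), checked here against a direct DFT, not a production routine.

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT: split into even- and odd-indexed halves,
    recurse, then combine with twiddle factors. Length must be 2**m."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        tw = cmath.exp(-2j * cmath.pi * k / n) * odd[k]  # twiddle factor
        out[k] = even[k] + tw
        out[k + n // 2] = even[k] - tw
    return out

def dft(x):
    """Direct O(n^2) DFT, used only to verify the FFT."""
    n = len(x)
    return [sum(x[m] * cmath.exp(-2j * cmath.pi * k * m / n)
                for m in range(n))
            for k in range(n)]

signal = [1, 2, 3, 4, 0, 0, 0, 0]
```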

There are a variety of texts available covering signal processing. Modern introductory systems and signal processing texts cover both analog and discrete theory. Many reflect the shift to discrete methods that began with the discovery of the FFT and was fueled by the ever-increasing power of computing machines. These often concentrate on discrete techniques and presuppose a background in analog signal processing. Again, there is a distinction between discrete and digital signals. Discrete signals are theoretical entities, derived by taking instantaneous (and therefore exact) samples from analog signals. They might assume irrational values at some time instants, and the range of their values might be infinite. Hence, a digital computer, whose memory elements only hold limited-precision values, can process only those discrete signals whose values are finite in number and finite in their precision: digital signals. Early texts on discrete signal processing sometimes blurred the distinction between the two types of signals, though later editions have adopted the more precise terminology. Noteworthy, however, are the burgeoning applications of digital signal processing integrated circuits: digital telephony, modems, mobile radio, digital control systems, and digital video, to name a few. The first high-definition television (HDTV) systems were analog; later, superior HDTV technologies have relied upon digital techniques. This technology has created a true digital signal processing literature, comprised of the technical manuals for various DSP chips, their application notes, and general treatments on fast algorithms for real-time signal processing and analysis applications on digital signal processors. Some of our later examples and applications offer observations on architectures appropriate for signal processing, special instruction sets, and fast algorithms suitable for DSP implementation.
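
The discrete-versus-digital distinction can be made concrete with a toy uniform quantizer; the 8-bit word length and full-scale range here are assumed, illustrative choices, not values from the text.

```python
import math

def quantize_uniform(value, bits, full_scale=1.0):
    """Round value to the nearest level of a uniform quantizer with
    2**bits levels spanning [-full_scale, full_scale)."""
    step = 2 * full_scale / (2 ** bits)
    q = round(value / step) * step
    # Clamp to the representable range of the quantizer.
    return max(-full_scale, min(full_scale - step, q))

exact = math.sin(1.0)                  # discrete sample: exact, irrational
digital = quantize_uniform(exact, 8)   # digital sample: finite precision
error = abs(exact - digital)           # at most half a quantizer step
```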

This chapter introduces signals and the mathematical tools needed to work with them. Everyone should review this chapter's first six sections. This first chapter combines discussions of analog signals, discrete signals, digital signals, and the methods to transition from one of these realms to another. All that it requires of the reader is a familiarity with calculus. There is a wide variety of examples. They illustrate basic signal concepts, filtering methods, and some easily understood, albeit limited, techniques for signal interpretation. The first section introduces the terminology of signal processing, the conventional architecture of signal processing systems, and the notions of analog, discrete, and digital signals. It describes signals in terms of mathematical models: functions of a single real or integral variable. A specification of a sequence of numerical values ordered by time or some other spatial dimension is a time-domain description of a signal. There are other approaches to signal description: the frequency and scale domains, as well as some relatively recent methods for combining them with the time-domain description. Sections 1.2 and 1.3 cover the two basic signal families: analog and discrete, respectively. Many of the signals used as examples come from conventional algebra and analysis.

The discussion gets progressively more formal. Section 1.4 covers sampling and interpolation. Sampling picks a discrete signal from an analog source, and interpolation works the other way, filling the gaps between discrete samples to fashion an analog signal from a discrete one. By way of these operations, signals pass from the analog world into the discrete world and vice versa. Section 1.5 covers periodic signals; foremost among these is the class of sinusoids, the fundamental tools for constructing a frequency-domain description of a signal. There are many special classes of signals that we need to consider, and Section 1.6 quickly collects them and discusses their properties. We will of course expand upon and deepen our understanding of these special types of signals throughout the book. Readers with signal processing backgrounds may quickly scan this material; however, those with little prior work in this area might well linger over these parts.
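
As a small sketch of the sampling-and-interpolation round trip that Section 1.4 develops, the following samples a sinusoid and linearly interpolates between the samples. Linear interpolation is only the simplest such scheme; the chapter also treats sinc-based reconstruction and cubic splines.

```python
import math

def linear_interpolate(samples, T):
    """Return an analog-style function t -> value, rebuilt from the
    samples x(nT) by connecting adjacent samples with straight lines."""
    def x_hat(t):
        n = int(t // T)
        n = max(0, min(n, len(samples) - 2))   # stay inside the sample range
        frac = (t - n * T) / T                 # position between samples
        return samples[n] * (1 - frac) + samples[n + 1] * frac
    return x_hat

T = 0.05                                        # sampling interval
samples = [math.sin(2 * math.pi * n * T) for n in range(41)]  # 1 Hz sinusoid
x_hat = linear_interpolate(samples, T)          # analog-style reconstruction
```

At the sample instants the reconstruction agrees exactly with the samples; between them it approximates the original sinusoid, with error shrinking as T shrinks.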

The last two sections cover some of the mathematics that arises in the detailed study of signals. The complex number system is essential for characterizing the timing relationships in signals and their frequency content. Section 1.7 explains why complex numbers are useful for signal processing and exposes some of their unique properties. Random signals are described in Section 1.8. Their application is to model the unpredictability in natural signals, both analog and discrete. Readers with a strong mathematics background may wish to skim the chapter for the special signal processing terminology and skip Sections 1.7 and 1.8. These sections can also be omitted from a first reading of the text.

A summary, a list of references, and a problem set complete the chapter. The summary provides supplemental historical notes. It also identifies some software resources and publicly available data sets. The references point out other introductory texts, reviews, and surveys from periodicals, as well as some of the recent research.

1.1 INTRODUCTION TO SIGNALS

There are several standpoints from which to study signal analysis problems: empirical, technical, and theoretical. This chapter uses all of them. We present many examples, and we will return to them often as we continue to develop methods for their processing and interpretation. After surveying practical applications of signal processing and analysis, we introduce some basic terminology, goals, and strategies.

Our early methods will be largely experimental. It will often be difficult to decide upon the best approach in an application; this is the limitation of an intuitive approach. But there will also be opportunities for making technical observations about the right mathematical tool or technique when engaged in a practical signal analysis problem. Mathematical tools for describing signals and their characteristics will continue to illuminate this technical side to our work. Finally, some abstract considerations will arise at the end of the chapter when we consider complex numbers and random signal theory. Right now, however, we seek only to spotlight some practical and technical issues related to signal processing and analysis applications. This will provide the motivation for building a significant theoretical apparatus in the sequel.

1.1.1 Basic Concepts

Signals are symbols or values that appear in some order, and they are familiar entities from science, technology, society, and life. Examples fit easily into these categories: radio-frequency emissions from a distant quasar; telegraph, telephone, and television transmissions; people speaking to one another, using hand gestures; raising a sequence of flags upon a ship's mast; the echolocation chirp of animals such as bats and dolphins; nerve impulses to muscles; and the sensation of light patterns striking the eye. Some of these signal values are quantifiable; the phenomenon is a measurable quantity, and its evolution is ordered by time or distance. Thus, a residential telephone signal's value is known by measuring the voltage across the pair of wires that comprise the circuit. Sound waves are longitudinal and produce minute, but measurable, pressure variations on a listener's eardrum. On the other hand, some signals appear to have a representation that is at root not quantifiable, but rather symbolic. Thus, most people would grant that sign language gestures, maritime signal flags, and even ASCII text could be considered signals, albeit of a symbolic nature.

Let us for the moment concentrate on signals with quantifiable values. These are the traditional mathematical signal models, and a rich mathematical theory is available for studying them. We will consider signals that assume symbolic values, too, but, unlike signals with quantifiable values, these entities are better described by relational mathematical structures, such as graphs.

Now, if the signal is a continuously occurring phenomenon, then we can represent it as a function of a time variable t; thus, x(t) is the value of signal x at time t. We understand the units of measurement of x(t) implicitly. The signal might vary with some spatial dimension other than time, but in any case, we can suppose that its domain is a subset of the real numbers. We then say that x(t) is an analog signal. Analog signal values are read from conventional indicating devices or scientific instruments, such as oscilloscopes, dial gauges, strip charts, and so forth.

An example of an analog signal is the seismogram, which records the shaking motion of the ground during an earthquake. A precision instrument, called a seismograph, measures ground displacements on the order of a micron (10^-6 m) and produces the seismogram on a paper strip chart attached to a rotating drum. Figure 1.1 shows the record of the Loma Prieta earthquake, centered in the Santa Cruz mountains of northern California, which struck the San Francisco Bay area on 18 October 1989.

Seismologists analyze such a signal in several ways. The total deflection of the pen across the chart is useful in determining the temblor's magnitude. Seismograms register three important types of waves: the primary, or P waves; the secondary, or S waves; and the surface waves. P waves arrive first, and they are compressive, so their direction of motion aligns with the wave front propagation. The transverse S waves follow. They oscillate perpendicular to the direction of propagation. Finally, the large, sweeping surface waves appear on the trace.

This simple example illustrates processing and analysis concepts.

Continues...


Excerpted from Signal Analysis by Ronald L. Allen and Duncan W. Mills. Copyright © 2004 by the Institute of Electrical and Electronics Engineers, Inc. Excerpted by permission.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

Preface.

Acknowledgments.

1 Signals: Analog, Discrete, and Digital.

1.1 Introduction to Signals.

1.1.1 Basic Concepts.

1.1.2 Time-Domain Description of Signals.

1.1.3 Analysis in the Time-Frequency Plane.

1.1.4 Other Domains: Frequency and Scale.

1.2 Analog Signals.

1.2.1 Definitions and Notation.

1.2.2 Examples.

1.2.3 Special Analog Signals.

1.3 Discrete Signals.

1.3.1 Definitions and Notation.

1.3.2 Examples.

1.3.3 Special Discrete Signals.

1.4 Sampling and Interpolation.

1.4.1 Introduction.

1.4.2 Sampling Sinusoidal Signals.

1.4.3 Interpolation.

1.4.4 Cubic Splines.

1.5 Periodic Signals.

1.5.1 Fundamental Period and Frequency.

1.5.2 Discrete Signal Frequency.

1.5.3 Frequency Domain.

1.5.4 Time and Frequency Combined.

1.6 Special Signal Classes.

1.6.1 Basic Classes.

1.6.2 Summable and Integrable Signals.

1.6.3 Finite Energy Signals.

1.6.4 Scale Description.

1.6.5 Scale and Structure.

1.7 Signals and Complex Numbers.

1.7.1 Introduction.

1.7.2 Analytic Functions.

1.7.3 Complex Integration.

1.8 Random Signals and Noise.

1.8.1 Probability Theory.

1.8.2 Random Variables.

1.8.3 Random Signals.

1.9 Summary.

1.9.1 Historical Notes.

1.9.2 Resources.

1.9.3 Looking Forward.

1.9.4 Guide to Problems.

References.

Problems.

2 Discrete Systems and Signal Spaces.

2.1 Operations on Signals.

2.1.1 Operations on Signals and Discrete Systems.

2.1.2 Operations on Systems.

2.1.3 Types of Systems.

2.2 Linear Systems.

2.2.1 Properties.

2.2.2 Decomposition.

2.3 Translation Invariant Systems.

2.4 Convolutional Systems.

2.4.1 Linear, Translation-Invariant Systems.

2.4.2 Systems Defined by Difference Equations.

2.4.3 Convolution Properties.

2.4.4 Application: Echo Cancellation in Digital Telephony.

2.5 The lp Signal Spaces.

2.5.1 lp Signals.

2.5.2 Stable Systems.

2.5.3 Toward Abstract Signal Spaces.

2.5.4 Normed Spaces.

2.5.5 Banach Spaces.

2.6 Inner Product Spaces.

2.6.1 Definitions and Examples.

2.6.2 Norm and Metric.

2.6.3 Orthogonality.

2.7 Hilbert Spaces.

2.7.1 Definitions and Examples.

2.7.2 Decomposition and Direct Sums.

2.7.3 Orthonormal Bases.

2.8 Summary.

References.

Problems.

3 Analog Systems and Signal Spaces.

3.1 Analog Systems.

3.1.1 Operations on Analog Signals.

3.1.2 Extensions to the Analog World.

3.1.3 Cross-Correlation, Autocorrelation, and Convolution.

3.1.4 Miscellaneous Operations.

3.2 Convolution and Analog LTI Systems.

3.2.1 Linearity and Translation-Invariance.

3.2.2 LTI Systems, Impulse Response, and Convolution.

3.2.3 Convolution Properties.

3.2.4 Dirac Delta Properties.

3.2.5 Splines.

3.3 Analog Signal Spaces.

3.3.1 Lp Spaces.

3.3.2 Inner Product and Hilbert Spaces.

3.3.3 Orthonormal Bases.

3.3.4 Frames.

3.4 Modern Integration Theory.

3.4.1 Measure Theory.

3.4.2 Lebesgue Integration.

3.5 Distributions.

3.5.1 From Function to Functional.

3.5.2 From Functional to Distribution.

3.5.3 The Dirac Delta.

3.5.4 Distributions and Convolution.

3.5.5 Distributions as a Limit of a Sequence.

3.6 Summary.

3.6.1 Historical Notes.

3.6.2 Looking Forward.

3.6.3 Guide to Problems.

References.

Problems.

4 Time-Domain Signal Analysis.

4.1 Segmentation.

4.1.1 Basic Concepts.

4.1.2 Examples.

4.1.3 Classification.

4.1.4 Region Merging and Splitting.

4.2 Thresholding.

4.2.1 Global Methods.

4.2.2 Histograms.

4.2.3 Optimal Thresholding.

4.2.4 Local Thresholding.

4.3 Texture.

4.3.1 Statistical Measures.

4.3.2 Spectral Methods.

4.3.3 Structural Approaches.

4.4 Filtering and Enhancement.

4.4.1 Convolutional Smoothing.

4.4.2 Optimal Filtering.

4.4.3 Nonlinear Filters.

4.5 Edge Detection.

4.5.1 Edge Detection on a Simple Step Edge.

4.5.2 Signal Derivatives and Edges.

4.5.3 Conditions for Optimality.

4.5.4 Retrospective.

4.6 Pattern Detection.

4.6.1 Signal Correlation.

4.6.2 Structural Pattern Recognition.

4.6.3 Statistical Pattern Recognition.

4.7 Scale Space.

4.7.1 Signal Shape, Concavity, and Scale.

4.7.2 Gaussian Smoothing.

4.8 Summary.

References.

Problems.

5 Fourier Transforms of Analog Signals.

5.1 Fourier Series.

5.1.1 Exponential Fourier Series.

5.1.2 Fourier Series Convergence.

5.1.3 Trigonometric Fourier Series.

5.2 Fourier Transform.

5.2.1 Motivation and Definition.

5.2.2 Inverse Fourier Transform.

5.2.3 Properties.

5.2.4 Symmetry Properties.

5.3 Extension to L2(R).

5.3.1 Fourier Transforms in L1(R) ∩ L2(R).

5.3.2 Definition.

5.3.3 Isometry.

5.4 Summary.

5.4.1 Historical Notes.

5.4.2 Looking Forward.

References.

Problems.

6 Generalized Fourier Transforms of Analog Signals.

6.1 Distribution Theory and Fourier Transforms.

6.1.1 Examples.

6.1.2 The Generalized Inverse Fourier Transform.

6.1.3 Generalized Transform Properties.

6.2 Generalized Functions and Fourier Series Coefficients.

6.2.1 Dirac Comb: A Fourier Series Expansion.

6.2.2 Evaluating the Fourier Coefficients: Examples.

6.3 Linear Systems in the Frequency Domain.

6.3.1 Convolution Theorem.

6.3.2 Modulation Theorem.

6.4 Introduction to Filters.

6.4.1 Ideal Low-pass Filter.

6.4.2 Ideal High-pass Filter.

6.4.3 Ideal Bandpass Filter.

6.5 Modulation.

6.5.1 Frequency Translation and Amplitude Modulation.

6.5.2 Baseband Signal Recovery.

6.5.3 Angle Modulation.

6.6 Summary.

References.

Problems.

7 Discrete Fourier Transforms.

7.1 Discrete Fourier Transform.

7.1.1 Introduction.

7.1.2 The DFT’s Analog Frequency-Domain Roots.

7.1.3 Properties.

7.1.4 Fast Fourier Transform.

7.2 Discrete-Time Fourier Transform.

7.2.1 Introduction.

7.2.2 Properties.

7.2.3 LTI Systems and the DTFT.

7.3 The Sampling Theorem.

7.3.1 Band-Limited Signals.

7.3.2 Recovering Analog Signals from Their Samples.

7.3.3 Reconstruction.

7.3.4 Uncertainty Principle.

7.4 Summary.

References.

Problems.

8 The z-Transform.

8.1 Conceptual Foundations.

8.1.1 Definition and Basic Examples.

8.1.2 Existence.

8.1.3 Properties.

8.2 Inversion Methods.

8.2.1 Contour Integration.

8.2.2 Direct Laurent Series Computation.

8.2.3 Properties and z-Transform Table Lookup.

8.2.4 Application: Systems Governed by Difference Equations.

8.3 Related Transforms.

8.3.1 Chirp z-Transform.

8.3.2 Zak Transform.

8.4 Summary.

8.4.1 Historical Notes.

8.4.2 Guide to Problems.

References.

Problems.

9 Frequency-Domain Signal Analysis.

9.1 Narrowband Signal Analysis.

9.1.1 Single Oscillatory Component: Sinusoidal Signals.

9.1.2 Application: Digital Telephony DTMF.

9.1.3 Filter Frequency Response.

9.1.4 Delay.

9.2 Frequency and Phase Estimation.

9.2.1 Windowing.

9.2.2 Windowing Methods.

9.2.3 Power Spectrum Estimation.

9.2.4 Application: Interferometry.

9.3 Discrete Filter Design and Implementation.

9.3.1 Ideal Filters.

9.3.2 Design Using Window Functions.

9.3.3 Approximation.

9.3.4 Z-Transform Design Techniques.

9.3.5 Low-Pass Filter Design.

9.3.6 Frequency Transformations.

9.3.7 Linear Phase.

9.4 Wideband Signal Analysis.

9.4.1 Chirp Detection.

9.4.2 Speech Analysis.

9.4.3 Problematic Examples.

9.5 Analog Filters.

9.5.1 Introduction.

9.5.2 Basic Low-Pass Filters.

9.5.3 Butterworth.

9.5.4 Chebyshev.

9.5.5 Inverse Chebyshev.

9.5.6 Elliptic Filters.

9.5.7 Application: Optimal Filters.

9.6 Specialized Frequency-Domain Techniques.

9.6.1 Chirp-z Transform Application.

9.6.2 Hilbert Transform.

9.6.3 Perfect Reconstruction Filter Banks.

9.7 Summary.

References.

Problems.

10 Time-Frequency Signal Transforms.

10.1 Gabor Transforms.

10.1.1 Introduction.

10.1.2 Interpretations.

10.1.3 Gabor Elementary Functions.

10.1.4 Inversion.

10.1.5 Applications.

10.1.6 Properties.

10.2 Short-Time Fourier Transforms.

10.2.1 Window Functions.

10.2.2 Transforming with a General Window.

10.2.3 Properties.

10.2.4 Time-Frequency Localization.

10.3 Discretization.

10.3.1 Transforming Discrete Signals.

10.3.2 Sampling the Short-Time Fourier Transform.

10.3.3 Extracting Signal Structure.

10.3.4 A Fundamental Limitation.

10.3.5 Frames of Windowed Fourier Atoms.

10.3.6 Status of Gabor’s Problem.

10.4 Quadratic Time-Frequency Transforms.

10.4.1 Spectrogram.

10.4.2 Wigner–Ville Distribution.

10.4.3 Ambiguity Function.

10.4.4 Cross-Term Problems.

10.4.5 Kernel Construction Method.

10.5 The Balian–Low Theorem.

10.5.1 Orthonormal Basis Decomposition.

10.5.2 Frame Decomposition.

10.5.3 Avoiding the Balian–Low Trap.

10.6 Summary.

10.6.1 Historical Notes.

10.6.2 Resources.

10.6.3 Looking Forward.

References.

Problems.

11 Time-Scale Signal Transforms.

11.1 Signal Scale.

11.2 Continuous Wavelet Transforms.

11.2.1 An Unlikely Discovery.

11.2.2 Basic Theory.

11.2.3 Examples.

11.3 Frames.

11.3.1 Discretization.

11.3.2 Conditions on Wavelet Frames.

11.3.3 Constructing Wavelet Frames.

11.3.4 Better Localization.

11.4 Multiresolution Analysis and Orthogonal Wavelets.

11.4.1 Multiresolution Analysis.

11.4.2 Scaling Function.

11.4.3 Discrete Low-Pass Filter.

11.4.4 Orthonormal Wavelet.

11.5 Summary.

References.

Problems.

12 Mixed-Domain Signal Analysis.

12.1 Wavelet Methods for Signal Structure.

12.1.1 Discrete Wavelet Transform.

12.1.2 Wavelet Pyramid Decomposition.

12.1.3 Application: Multiresolution Shape Recognition.

12.2 Mixed-Domain Signal Processing.

12.2.1 Filtering Methods.

12.2.2 Enhancement Techniques.

12.3 Biophysical Applications.

12.3.1 David Marr’s Program.

12.3.2 Psychophysics.

12.4 Discovering Signal Structure.

12.4.1 Edge Detection.

12.4.2 Local Frequency Detection.

12.4.3 Texture Analysis.

12.5 Pattern Recognition Networks.

12.5.1 Coarse-to-Fine Methods.

12.5.2 Pattern Recognition Networks.

12.5.3 Neural Networks.

12.5.4 Application: Process Control.

12.6 Signal Modeling and Matching.

12.6.1 Hidden Markov Models.

12.6.2 Matching Pursuit.

12.6.3 Applications.

12.7 Afterword.

References.

Problems.

Index.
