Lessons in Estimation Theory for Signal Processing, Communications, and Control [NOOK Book]


Available on NOOK devices and apps:
  • NOOK Devices: Samsung Galaxy Tab 4 NOOK 7.0, Samsung Galaxy Tab 4 NOOK 10.1, NOOK HD Tablet, NOOK HD+ Tablet
  • NOOK eReaders: NOOK Color, NOOK Tablet
  • Tablet/Phone: NOOK for Windows 8 Tablet, NOOK for iOS, NOOK for Android, NOOK Kids for iPad
  • PC/Mac: NOOK for Windows 8, NOOK for PC, NOOK for Mac


NOOK Book (eBook)
BN.com price: $53.49 (save 42% off the $92.99 list price)

Overview

Estimation theory is a product of need and technology. As a result, it is an integral part of many branches of science and engineering. To help readers differentiate among the rich collection of estimation methods and algorithms, this book describes in detail many of the important estimation methods and shows how they are interrelated. Written as a collection of lessons, this book introduces readers to the general field of estimation theory and includes abundant supplementary material.

Editorial Reviews

Booknews
Introduces the general field of estimation theory, describing estimation methods and algorithms and their interrelations. About half of the book is devoted to parameter estimation, while the other half concentrates on state estimation. Stresses estimation theory as a natural adjunct to classical digital signal processing and presents estimation from a discrete-time viewpoint. Includes summaries, questions, exercises, and answers, plus b&w graphs and diagrams. Annotation © Book News, Inc., Portland, OR (booknews.com)

Product Details

  • ISBN-13: 9780132440790
  • Publisher: Pearson Education
  • Publication date: 3/28/1995
  • Sold by: Barnes & Noble
  • Format: eBook
  • Edition number: 2
  • Pages: 592
  • File size: 25 MB
  • Note: This product may take a few minutes to download.

Read an Excerpt

PREFACE: Estimation theory is widely used in many branches of science and engineering. No doubt, one could trace its origin back to ancient times, but Karl Friedrich Gauss is generally acknowledged to be the progenitor of what we now refer to as estimation theory. R. A. Fisher, Norbert Wiener, Rudolf E. Kalman, and scores of others have expanded upon Gauss's legacy and have given us a rich collection of estimation methods and algorithms from which to choose. This book describes many of the important estimation methods and shows how they are interrelated.


Estimation theory is a product of need and technology. Gauss, for example, needed to predict the motions of planets and comets from telescopic measurements. This "need" led to the method of least squares. Digital computer technology has revolutionized our lives. It created the need for recursive estimation algorithms, one of the most important ones being the Kalman filter. Because of the importance of digital technology, this book presents estimation from a discrete-time viewpoint. In fact, it is this author's viewpoint that estimation theory is a natural adjunct to classical digital signal processing. It produces time-varying digital filter designs that operate on random data in an optimal manner.
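

To make the least-squares idea concrete in the book's computational language, here is a minimal MATLAB sketch, not taken from the book, of a batch least-squares fit for a linear model z = H*theta + v; the model, data sizes, and noise level are invented purely for illustration.

    % Batch least-squares estimate for an illustrative linear model z = H*theta + v.
    % All numbers below are made-up placeholders, not values from the book.
    N = 100;                        % number of measurements
    theta_true = [2; -0.5];         % "unknown" parameters used to synthesize data
    H = [ones(N,1), (1:N)'];        % observation (design) matrix: constant plus trend
    v = 0.1*randn(N,1);             % zero-mean measurement noise
    z = H*theta_true + v;           % synthetic measurements

    theta_hat = (H'*H)\(H'*z);      % batch least-squares estimate via the normal equations

    % In practice, MATLAB's backslash operator, theta_hat = H\z, solves the same
    % problem more robustly through an orthogonal (QR) factorization.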


Although this book is entitled "Estimation Theory," computation is essential in order to be able to use its many estimation algorithms. Consequently, computation is an integral part of this book.


It is this author's viewpoint that, whenever possible, computation should be left to the experts. Consequently, I have linked computation into MATLAB® (MATLAB is a registered trademark of The MathWorks, Inc.) and its associated toolboxes. A small number of important estimation M-files, which do not presently appear in any MathWorks toolbox, have been included in this book; they can be found in Appendix B.


This book has been written as a collection of lessons. It is meant to be an introduction to the general field of estimation theory and, as such, is not encyclopedic in content or in references. The supplementary material, which has been included at the end of many lessons, provides additional breadth or depth to those lessons. This book can be used for self-study or in a one-semester course.


Each lesson begins with a summary that describes the main points of the lesson and also lets the reader know exactly what he or she will be able to do as a result of completing the lesson. Each lesson also includes a small collection of multiple-choice summary questions, which are meant to test the reader on whether or not he or she has grasped the lesson's key points. Many of the lessons include a section entitled "Computation."


When I decided to include material about computation, it was not clear to me whether such material should be collected together in one place, say at the rear of the book in an appendix, or whether it should appear at the end of each lesson, on demand so to speak. I sent letters to more than 50 colleagues and former students asking them what their preference would be. The overwhelming majority recommended having discussions about computation at the end of each lesson. I would like to thank the following for helping me to make this decision: Chong-Yung Chi, Keith Chugg, Georgios B. Giannakis, John Goutsias, Ioannis Katsavounidis, Bart Kosko, Li-Chien Lin, David Long, George Papavassilopoulos, Michael Safonov, Mostafa Shiva, Robert Scholtz, Ananthram Swami, Charles Weber, and Lloyd Welch.


Approximately one-half of the book is devoted to parameter estimation and the other half to state estimation. For many years there has been a tendency to treat state estimation, especially Kalman filtering, as a stand-alone subject and even to treat parameter estimation as a special case of state estimation. Historically, this is incorrect. In the musical Fiddler on the Roof, Tevye argues on behalf of "Tradition!" Estimation theory also has its tradition, and it begins with Gauss and parameter estimation. In Lesson 2 we show that state estimation is a special case of parameter estimation; i.e., it is the problem of estimating random parameters when these parameters change from one time instant to the next. Consequently, the subject of state estimation flows quite naturally from the subject of parameter estimation.
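

To see how "parameters that change from one time instant to the next" are estimated recursively, here is a minimal MATLAB sketch, not taken from the book, of one predict/correct cycle of a discrete-time Kalman filter; the model matrices, noise covariances, and measurement value are invented placeholders.

    % One predict/correct cycle of a discrete-time Kalman filter (illustrative only).
    % Model: x(k+1) = Phi*x(k) + w(k),  z(k) = H*x(k) + v(k),
    % with cov(w) = Q and cov(v) = R. All numbers below are placeholders.
    Phi = [1 1; 0 1];                 % state-transition matrix (constant-velocity model)
    H   = [1 0];                      % measurement matrix
    Q   = 0.01*eye(2);                % process-noise covariance
    R   = 0.25;                       % measurement-noise variance (scalar measurement)
    x   = [0; 0];  P = eye(2);        % current estimate x(k|k) and covariance P(k|k)
    z   = 1.7;                        % new measurement z(k+1)

    x_pred = Phi*x;                   % prediction:  x(k+1|k)
    P_pred = Phi*P*Phi' + Q;          %              P(k+1|k)

    K = P_pred*H'/(H*P_pred*H' + R);  % Kalman gain
    x = x_pred + K*(z - H*x_pred);    % correction:  x(k+1|k+1)
    P = (eye(2) - K*H)*P_pred;        %              P(k+1|k+1)

Repeating these two steps as each new measurement arrives yields exactly the kind of recursive, time-varying digital filter the preface alludes to.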


There are four supplemental lessons. Lesson A is on sufficient statistics and statistical estimation of parameters and has been written by Professor Rama Chellappa. Lessons B and C are on higher-order statistics. These three lessons are on parameter estimation topics. Lesson D is a review of state-variable models. It has been included because I have found that some people who take a course on estimation theory are not as well versed as they need to be about state-variable models in order to understand state estimation.


This book is an outgrowth of a one-semester course on estimation theory that has been taught at the University of Southern California since 1978, where we cover all its contents at the rate of two lessons a week. I wish to thank Mostafa Shiva, Alan Laub, George Papavassilopoulos, and Rama Chellappa for encouraging me to convert the course lecture notes into a book. The result was the first version of this book, which was published in 1987 as Lessons in Digital Estimation Theory. Since that time the course has been taught many times and additional materials have been included. Very little has been deleted. The result is this new edition.


Most of the book's important results are summarized in theorems and corollaries. In order to guide the reader to these results, they have been summarized for easy reference in Appendix A.


Problems are included for all the lessons (except Lesson 1, which is the Introduction), because this is a textbook. The problems fall into three groups. The first group contains problems that ask the reader to fill in details, which have been "left to the reader as an exercise."


The second group contains problems that are related to the material in the lesson. They range from theoretical to easy computational problems, easy in the sense that the computations can be carried out by hand. The third group contains computational problems that can only be carried out using a computer. Many of the problems were developed by students in my Fall 1991 and Spring 1992 classes at USC on Estimation Theory. For these problems, the name(s) of the problem developer(s) appears in parentheses at the beginning of each problem. The author wishes to thank all the problem developers.


While writing the first edition of the book, the author had the benefit of comments and suggestions from many of his colleagues and students. I especially want to acknowledge the help of Georgios B. Giannakis, Guan-Zhong Dai, Chong-Yung Chi, Phil Burns, Youngby Kim, Chung-Chin Lu, and Tom Hebert. While writing the second edition of the book, the author had the benefit of comments and suggestions from Georgios B. Giannakis, Mithat C. Dogan, Don Specht, Tom Hebert, Ted Harris, and Egemen Gonen. Special thanks to Mitsuru Nakamura for writing the estimation algorithm M-files that appear in Appendix B; to Ananthram Swami for generating Figures B.4, B.5, and B.7; and to Gent Paparisto for helping with the editing of the galley proofs.


Additionally, the author wishes to thank Marcel Dekker, Inc., for permitting him to include material from J. M. Mendel, Discrete Techniques of Parameter Estimation: The Equation Error Formulation, 1973, in Lessons 1-3, 5-9, 11, 18, and 23; Academic Press, Inc., for permitting him to include material from J. M. Mendel, Optimal Seismic Deconvolution: An Estimation-Based Approach, copyright © 1983 by Academic Press, Inc., in Lessons 11-17, 19-21, and 25; and the Institute of Electrical and Electronics Engineers (IEEE) for permitting him to include material from J. M. Mendel, Kalman Filtering and Other Digital Estimation Techniques: Study Guide, copyright © 1987 IEEE, in Lessons 1-3, 5-26, and D. I hope that the readers do not find it too distracting when I reference myself for an item such as a proof (e.g., the proof of Theorem 17-1). This is done only when I have taken material from one of my former publications (e.g., any one of the preceding three), to comply with copyright law, and is in no way meant to imply that a particular result is necessarily my own.


I am very grateful to my editor Karen Gettman and to Jane Bonnell and other staff members at Prentice Hall for their help in the production of this book.


Finally, I want to thank my wife, Letty, to whom this book is dedicated, for providing me, for more than 30 years, with a wonderful environment that has made this book possible.

Table of Contents

1. Introduction, Coverage, Philosophy, and Computation.
2. The Linear Model.
3. Least-Squares Estimation: Batch Processing.
4. Least-Squares Estimation: Singular-Value Decomposition.
5. Least-Squares Estimation: Recursive Processing.
6. Small Sample Properties of Estimators.
7. Large Sample Properties of Estimators.
8. Properties of Least-Squares Estimators.
9. Best Linear Unbiased Estimation.
10. Likelihood.
11. Maximum-Likelihood Estimation.
12. Multivariate Gaussian Random Variables.
13. Mean-Squared Estimation of Random Parameters.
14. Maximum A Posteriori Estimation of Random Parameters.
15. Elements of Discrete-Time Gauss-Markov Random Sequences.
16. State Estimation: Prediction.
17. State Estimation: Filtering (The Kalman Filter).
18. State Estimation: Filtering Examples.
19. State Estimation: Steady-State Kalman Filter and Its Relationships to a Digital Wiener Filter.
20. State Estimation: Smoothing.
21. State Estimation: Smoothing (General Results).
22. State Estimation for the Not-So-Basic State-Variable Model.
23. Linearization and Discretization of Nonlinear Systems.
24. Iterated Least Squares and Extended Kalman Filtering.
25. Maximum-Likelihood State and Parameter Estimation.
26. Kalman-Bucy Filtering.
A. Sufficient Statistics and Statistical Estimation of Parameters.
B. Introduction to Higher-Order Statistics.
C. Estimation and Applications of Higher-Order Statistics.
D. Introduction to State-Variable Models and Methods.
Appendix A: Glossary of Major Results.
Appendix B: Estimation Algorithm M-Files.
References.
Index.

