
## Hardcover (Older Edition)

$101.75

### **Temporarily Out of Stock Online**

## Overview


## Product Details

| | |
|---|---|
| ISBN-13: | 9780070539174 |
| Publisher: | McGraw-Hill Companies, The |
| Publication date: | 05/28/1995 |
| Edition description: | Older Edition |
| Pages: | 600 |
| Product dimensions: | 6.72(w) x 9.58(h) x 1.16(d) |

## About the Author

**Timothy J. Ross, University of New Mexico, USA**

Dr. Ross is a professor in the Department of Civil Engineering at the University of New Mexico, where he teaches courses in structural analysis, structural dynamics, and fuzzy logic. He is a registered professional engineer with over 30 years' experience in the fields of computational mechanics, hazard survivability, structural dynamics, structural safety, stochastic processes, risk assessment, and fuzzy systems. He is also the founding Editor-in-Chief of the *International Journal of Intelligent and Fuzzy Systems*.

## Read an Excerpt

#### Fuzzy Logic with Engineering Applications

**By Timothy J. Ross**

**John Wiley & Sons**

**Copyright © 2004**

**John Wiley & Sons, Ltd**

All rights reserved.

**ISBN: 0-470-86074-X**

#### Chapter One

**INTRODUCTION**

*It is the mark of an instructed mind to rest satisfied with that degree of precision which the nature of the subject admits, and not to seek exactness where only an approximation of the truth is possible.*
Aristotle (384–322 BC), ancient Greek philosopher

*Precision is not truth.*
Henri Matisse (1869–1954), painter

*All traditional logic habitually assumes that precise symbols are being employed. It is therefore not applicable to this terrestrial life but only to an imagined celestial existence.*
Bertrand Russell (1923), British philosopher and Nobel laureate

*We must exploit our tolerance for imprecision.*
Lotfi Zadeh (1973), professor of systems engineering, UC Berkeley

The quotes above, all of them legendary, have a common thread. That thread represents the relationship between precision and uncertainty. The more uncertainty in a problem, the less precise we can be in our understanding of that problem. It is ironic that the oldest quote, above, is due to the philosopher who is credited with the establishment of Western logic: a binary logic that admits only the opposites of true and false, a logic which does not admit degrees of truth in between these two extremes. In other words, Aristotelian logic does not admit imprecision in truth. Yet Aristotle's quote is entirely appropriate today; it is a quote that admits uncertainty. It is an admonishment that we should heed; we should balance the precision we seek with the uncertainty that exists.

Most engineering texts do not address the uncertainty in the information, models, and solutions conveyed within the problems they address. This text is dedicated to the characterization and quantification of uncertainty within engineering problems so that an appropriate level of precision can be expressed. When we ask ourselves why we should engage in this pursuit, one reason should be obvious: achieving high levels of precision costs significantly in time, money, or both. Are we solving problems that require precision? The more complex a system is, the more imprecise or inexact is the information that we have to characterize that system. It seems, then, that precision, information, and complexity are inextricably related in the problems we pose for eventual solution. However, for most of the problems that we face, the quote above from Professor Zadeh suggests that we can do a better job by accepting some level of imprecision.

It seems intuitive that we should balance the degree of precision in a problem with the associated uncertainty in that problem. Hence, this book recognizes that uncertainty of various forms permeates all scientific endeavors and it exists as an integral feature of all abstractions, models, and solutions. It is the intent of this book to introduce methods to handle one of these forms of uncertainty in our technical problems, the form we have come to call fuzziness.

**THE CASE FOR IMPRECISION**

Our understanding of most physical processes is based largely on imprecise human reasoning. This imprecision (when compared to the precise quantities required by computers) is nonetheless a form of information that can be quite useful to humans. The ability to embed such reasoning in hitherto intractable and complex problems is the criterion by which the efficacy of fuzzy logic is judged. Undoubtedly this ability cannot solve problems that require precision - problems such as shooting precision laser beams over tens of kilometers in space; milling machine components to accuracies of parts per billion; or focusing a microscopic electron beam on a specimen the size of a nanometer. The impact of fuzzy logic in these areas might be years away, if ever. But not many human problems require such precision - problems such as parking a car, backing up a trailer, navigating a car among others on a freeway, washing clothes, controlling traffic at intersections, judging beauty contestants, and a preliminary understanding of a complex system.

Requiring precision in engineering models and products translates to requiring high cost and long lead times in production and development. For other than simple systems, expense is proportional to precision: more precision entails higher cost. When considering the use of fuzzy logic for a given problem, an engineer or scientist should ponder the need for *exploiting the tolerance for imprecision*. Not only does high precision dictate high costs, but it also entails low tractability in a problem. Articles in the popular media illustrate the need to exploit imprecision. Take the "traveling salesrep" problem, for example. In this classic optimization problem a sales representative wants to minimize total distance traveled by considering various itineraries and schedules between a series of cities on a particular trip. For a small number of cities, the problem is a trivial exercise in enumerating all the possibilities and choosing the shortest route. As the number of cities grows, the problem quickly approaches a combinatorial explosion impossible to solve through an exhaustive search, even with a computer. For example, for 100 cities there are 100 × 99 × 98 × 97 × ··· × 2 × 1, or about 10¹⁵⁸, possible routes to consider! No computers exist today that can solve this problem through a brute-force enumeration of all the possible routes. There are real, practical problems analogous to the traveling salesrep problem. For example, such problems arise in the fabrication of circuit boards, where precise lasers drill hundreds of thousands of holes in the board. Deciding in which order to drill the holes (where the board moves under a stationary laser) so as to minimize drilling time is a traveling salesrep problem [Kolata, 1991].
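The combinatorial explosion described above is easy to demonstrate. The following sketch (illustrative, not from the text) brute-forces a tiny instance of the problem and then checks how hopeless the same approach becomes at 100 cities:

```python
import math
from itertools import permutations

def brute_force_tsp(dist):
    """Return the length of the shortest closed tour over all cities
    in `dist` (a symmetric distance matrix), by exhaustive search."""
    n = len(dist)
    best = float("inf")
    # Fixing city 0 as the start avoids recounting rotations of a tour.
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        best = min(best, sum(dist[a][b] for a, b in zip(tour, tour[1:])))
    return best

# A tiny 4-city instance: only 3! = 6 tours to enumerate.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
print(brute_force_tsp(dist))         # -> 18

# For 100 cities the same loop would face 99! tours (start city fixed),
# a number with more than 150 digits -- far beyond brute force.
print(math.factorial(99) > 10**150)  # -> True
```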

Thus, algorithms have been developed to solve the traveling salesrep problem in an optimal sense; that is, the exact answer is not guaranteed, but a near-optimal answer is achievable - the optimality is measured as a percent accuracy, with 0% representing the exact answer and accuracies larger than zero representing answers of lesser accuracy. Suppose we consider a signal routing problem analogous to the traveling salesrep problem where we want to find the optimum path (i.e., minimum travel time) between 100,000 nodes in a network to an accuracy within 1% of the exact solution; this requires significant CPU time on a supercomputer. If we take the same problem and increase the precision requirement a modest amount to an accuracy of 0.75%, the computing time approaches a few months! Now suppose we can live with an accuracy of 3.5% (quite a bit more accurate than most problems we deal with), and we want to consider an order-of-magnitude more nodes in the network, say 1,000,000; the computing time for this problem is on the order of several minutes [Kolata, 1991]. This remarkable reduction in cost (translating time to dollars) is due solely to the acceptance of a lesser degree of precision in the optimum solution. Can humans live with a little less precision? The answer to this question depends on the situation, but for the vast majority of problems we deal with every day the answer is a resounding yes.
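The precision-for-speed trade-off described above can be sketched with a standard approximation (not the algorithm Kolata describes): a greedy nearest-neighbor tour does O(n²) work instead of enumerating n! routes, and accepts an answer that is only approximately optimal:

```python
def nearest_neighbor_tour(dist, start=0):
    """Greedy heuristic: always move to the closest unvisited city.
    O(n^2) work instead of O(n!), at the price of an approximate answer."""
    n = len(dist)
    unvisited = set(range(n)) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda c: dist[tour[-1]][c])
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

def tour_length(dist, tour):
    closed = tour + [tour[0]]  # return to the starting city
    return sum(dist[a][b] for a, b in zip(closed, closed[1:]))

dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
tour = nearest_neighbor_tour(dist)
print(tour, tour_length(dist, tour))  # -> [0, 1, 3, 2] 18
```

On this tiny instance the greedy tour happens to match the exact optimum; in general it lands within some percentage of it, which is precisely the kind of imprecision the text argues we should tolerate.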

**AN HISTORICAL PERSPECTIVE**

From an historical point of view the issue of uncertainty has not always been embraced within the scientific community [Klir and Yuan, 1995]. In the traditional view of science, uncertainty represents an undesirable state, a state that must be avoided at all costs. This was the state of science until the late nineteenth century when physicists realized that Newtonian mechanics did not address problems at the molecular level. Newer methods, associated with statistical mechanics, were developed which recognized that statistical averages could replace the specific manifestations of microscopic entities. These statistical quantities, which summarized the activity of large numbers of microscopic entities, could then be connected in a model with appropriate macroscopic variables [Klir and Yuan, 1995]. Now, the role of Newtonian mechanics and its underlying calculus which considered no uncertainty was replaced with statistical mechanics which could be described by a probability theory - a theory which could capture a form of uncertainty, the type generally referred to as random uncertainty. After the development of statistical mechanics there has been a gradual trend in science during the past century to consider the influence of uncertainty on problems, and to do so in an attempt to make our models more robust, in the sense that we achieve credible solutions and at the same time quantify the amount of uncertainty.

Of course, the leading theory in quantifying uncertainty in scientific models from the late nineteenth century until the late twentieth century had been probability theory. However, the gradual evolution of the expression of uncertainty using probability theory was challenged, first in 1937 by Max Black, with his studies in vagueness, then with the introduction of fuzzy sets by Lotfi Zadeh in 1965. Zadeh's work [1965] had a profound influence on the thinking about uncertainty because it challenged not only probability theory as the sole representation for uncertainty, but the very foundations upon which probability theory was based: classical binary (two-valued) logic [Klir and Yuan, 1995].

Probability theory dominated the mathematics of uncertainty for over five centuries. Probability concepts date back to the 1500s, to the time of Cardano when gamblers recognized the rules of probability in games of chance. The concepts were still very much in the limelight in 1685, when the Bishop of Wells wrote a paper that discussed a problem in determining the truth of statements made by two witnesses who were both known to be unreliable to the extent that they only tell the truth with probabilities *p*₁ and *p*₂, respectively. The Bishop's answer to this was based on his assumption that the two witnesses were independent sources of information [Lindley, 1987].
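Lindley's account leaves the Bishop's arithmetic out, but one natural reading of the independence assumption can be sketched as follows (the witness probabilities used are illustrative, not from the text):

```python
def prob_true_given_agreement(p1, p2):
    """If two independent witnesses agree on a statement, they either
    both told the truth or both lied; the chance the statement is true
    is the truthful branch's share of those two possibilities."""
    both_truth = p1 * p2
    both_lie = (1 - p1) * (1 - p2)
    return both_truth / (both_truth + both_lie)

# Two unreliable witnesses who tell the truth 80% and 70% of the time:
print(round(prob_true_given_agreement(0.8, 0.7), 3))  # -> 0.903
```

Agreement between two independently unreliable sources thus yields considerably more confidence than either source alone.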

Probability theory was initially developed in the eighteenth century in such landmark treatises as Jacob Bernoulli's *Ars Conjectandi* (1713) and Abraham De Moivre's *Doctrine of Chances* (1718, 2nd edition 1738). Later in that century a small number of articles appeared in the periodical literature that would have a profound effect on the field. Most notable of these were Thomas Bayes's "An essay towards solving a problem in the doctrine of chances" (1763) and Pierre Simon Laplace's formulation of the axioms relating to games of chance, "Mémoire sur la probabilité des causes par les évènemens" (1774). Laplace, only 25 years old at the time he began his work in 1772, wrote the first substantial article in mathematical statistics prior to the nineteenth century. Despite the fact that Laplace, at the same time, was heavily engaged in mathematical astronomy, his memoir was an explosion of ideas that provided the roots for modern decision theory, Bayesian inference with nuisance parameters (historians claim that Laplace did not know of Bayes's earlier work), and the asymptotic approximations of posterior distributions [Stigler, 1986].

By the time of Newton, physicists and mathematicians were formulating different theories of probability. The most popular ones remaining today are the relative frequency theory and the subjectivist, or personalistic, theory. The latter development was initiated by Thomas Bayes (1763), who articulated his very powerful theorem for the assessment of subjective probabilities. The theorem specified that a human's degree of belief could be subjected to an objective, coherent, and measurable mathematical framework within subjective probability theory. In the early days of the twentieth century Rescher developed a formal framework for a conditional probability theory.
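Bayes's theorem, as described above, turns a prior degree of belief into a posterior one in light of evidence. A minimal sketch for a binary hypothesis (the numbers are illustrative, not from the text):

```python
def bayes_posterior(prior, p_e_given_h, p_e_given_not_h):
    """P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)(1 - P(H))]."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

# A 0.3 prior degree of belief, revised by evidence that is 0.9 likely
# if the hypothesis holds and 0.2 likely if it does not:
print(round(bayes_posterior(0.3, 0.9, 0.2), 3))  # -> 0.659
```

This is the "objective, coherent, and measurable" machinery the theorem supplies: whatever the subjective prior, the update rule is the same.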

The twentieth century saw the first developments of alternatives to probability theory and to classical Aristotelian logic as paradigms to address more kinds of uncertainty than just the random kind. Jan Łukasiewicz developed a multivalued, discrete logic (circa 1930). In the 1960s Arthur Dempster developed a theory of evidence which, for the first time, included an assessment of ignorance, or the absence of information. In 1965 Lotfi Zadeh introduced his seminal idea in a continuous-valued logic that he called fuzzy set theory. In the 1970s Glenn Shafer extended Dempster's work to produce a complete theory of evidence dealing with information from more than one source, and Lotfi Zadeh illustrated a possibility theory resulting from special cases of fuzzy sets. Later, in the 1980s, other investigators showed a strong relationship between evidence theory, probability theory, and possibility theory with the use of what were called fuzzy measures [Klir and Wierman, 1996], and what are now being termed monotone measures.

Uncertainty can be thought of in an epistemological sense as being the inverse of information. Information about a particular engineering or scientific problem may be incomplete, imprecise, fragmentary, unreliable, vague, contradictory, or deficient in some other way [Klir and Yuan, 1995]. When we acquire more and more information about a problem, we become less and less uncertain about its formulation and solution. Problems that are characterized by very little information are said to be ill-posed, complex, or not sufficiently known. These problems are imbued with a high degree of uncertainty. Uncertainty can be manifested in many forms: it can be fuzzy (not sharp, unclear, imprecise, approximate), it can be vague (not specific, amorphous), it can be ambiguous (too many choices, contradictory), it can be of the form of ignorance (dissonant, not knowing something), or it can be a form due to natural variability (conflicting, random, chaotic, unpredictable). Many other linguistic labels have been applied to these various forms, but for now these shall suffice. Zadeh [2002] posed some simple examples of these forms in terms of a person's statements about when they will return to their current location. *Continues...*

Excerpted from *Fuzzy Logic with Engineering Applications* by Timothy J. Ross. Copyright © 2004 by John Wiley & Sons, Ltd. Excerpted by permission.

All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.


## Table of Contents

About the Author xi

Preface to the Fourth Edition xiii

**1 Introduction 1**

The Case for Imprecision 2

A Historical Perspective 4

The Utility of Fuzzy Systems 7

Limitations of Fuzzy Systems 9

The Illusion: Ignoring Uncertainty and Accuracy 11

Uncertainty and Information 13

Fuzzy Sets and Membership 14

Chance versus Fuzziness 17

Intuition of Uncertainty: Fuzzy versus Probability 19

Sets as Points in Hypercubes 21

Summary 23

References 23

Problems 24

**2 Classical Sets and Fuzzy Sets 27**

Classical Sets 28

Fuzzy Sets 36

Summary 45

References 46

Problems 46

**3 Classical Relations and Fuzzy Relations 51**

Cartesian Product 52

Crisp Relations 53

Fuzzy Relations 58

Tolerance and Equivalence Relations 67

Fuzzy Tolerance and Equivalence Relations 70

Value Assignments 72

Other Forms of the Composition Operation 76

Summary 77

References 77

Problems 77

**4 Properties of Membership Functions, Fuzzification, and Defuzzification 84**

Features of the Membership Function 85

Various Forms 87

Fuzzification 88

Defuzzification to Crisp Sets 90

λ-Cuts for Fuzzy Relations 92

Defuzzification to Scalars 93

Summary 102

References 103

Problems 104

**5 Logic and Fuzzy Systems 107**

Part I: Logic 107

Classical Logic 108

Fuzzy Logic 122

Part II: Fuzzy Systems 132

Summary 151

References 153

Problems 154

**6 Historical Methods of Developing Membership Functions 163**

Membership Value Assignments 164

Intuition 164

Inference 165

Rank Ordering 167

Neural Networks 168

Genetic Algorithms 179

Inductive Reasoning 188

Summary 195

References 196

Problems 197

**7 Automated Methods for Fuzzy Systems 201**

Definitions 202

Batch Least Squares Algorithm 205

Recursive Least Squares Algorithm 210

Gradient Method 213

Clustering Method 218

Learning from Examples 221

Modified Learning from Examples 224

Summary 233

References 235

Problems 235

**8 Fuzzy Systems Simulation 237**

Fuzzy Relational Equations 242

Nonlinear Simulation Using Fuzzy Systems 243

Fuzzy Associative Memories (FAMs) 246

Summary 257

References 258

Problems 259

**9 Decision Making with Fuzzy Information 265**

Fuzzy Synthetic Evaluation 267

Fuzzy Ordering 269

Nontransitive Ranking 272

Preference and Consensus 275

Multiobjective Decision Making 279

Fuzzy Bayesian Decision Method 285

Decision Making under Fuzzy States and Fuzzy Actions 295

Summary 309

References 310

Problems 311

**10 Fuzzy Classification and Pattern Recognition 323**

Fuzzy Classification 324

Classification by Equivalence Relations 324

Cluster Analysis 332

Cluster Validity 332

c-Means Clustering 333

Hard c-Means (HCM) 333

Fuzzy c-Means (FCM) 343

Classification Metric 351

Hardening the Fuzzy c-Partition 354

Similarity Relations from Clustering 356

Fuzzy Pattern Recognition 357

Single-Sample Identification 357

Multifeature Pattern Recognition 365

Summary 378

References 379

Problems 380

**11 Fuzzy Control Systems 388**

Control System Design Problem 390

Examples of Fuzzy Control System Design 393

Fuzzy Engineering Process Control 404

Fuzzy Statistical Process Control 417

Industrial Applications 431

Summary 434

References 437

Problems 438

**12 Applications of Fuzzy Systems Using Miscellaneous Models 455**

Fuzzy Optimization 455

Fuzzy Cognitive Mapping 462

Agent-Based Models 477

Fuzzy Arithmetic and the Extension Principle 481

Fuzzy Algebra 487

Data Fusion 491

Summary 498

References 498

Problems 500

**13 Monotone Measures: Belief, Plausibility, Probability, and Possibility 505**

Monotone Measures 506

Belief and Plausibility 507

Evidence Theory 512

Probability Measures 515

Possibility and Necessity Measures 517

Possibility Distributions as Fuzzy Sets 525

Possibility Distributions Derived from Empirical Intervals 528

Summary 548

References 549

Problems 550

Index 554