Why should communication be any different for the twenty-first century than it has been in the recent past? Language changes slowly over time, but in any period not everyone is adept at using it. Still, we take our ability to communicate with one another for granted, as though it were immutable and required little attention. We may be unaware of, or unconcerned about, ineffective communication habits. Yet these can get in the way of interpersonal relationships, even inter-group relationships, a practical reason at any time for improving our skills.
Much has happened, though, in the last half of the twentieth century to make improving the effectiveness of communication appropriate, even necessary—that is, getting through to people by what we say and how we say it, rather than simply getting to them by the way we transmit it. These skills go beyond grammar, which we all hated to study in school; they go beyond vocabulary, which can be accumulated or improved; and they go beyond style of word usage and phraseology, which can depend upon linguistic background, education, and personal characteristics.
The skills I refer to involve the way our communication reflects the way we think. They involve a mindset that is sorely needed to keep up with the pace and demands of the twenty-first century. Think of all that has happened in the past one hundred years!
Globalization is one obvious change. The jet plane, the credit card, and the internationalization of corporations have brought peoples and nations of diverse languages and cultures into intimate contact with one another. In the mid-twentieth century, the United Nations pointed out the need for simultaneous translation, but even that capability did not prevent misunderstanding because of subtleties of word meanings and tones of expression. Diplomatic language can veil ambiguity on purpose, but business communication between speakers of different languages needs to be unambiguous to be effective.
Electronic technology has exploded with possibilities. The Internet is the most recent example of a technology that brings diverse people together, as is the cell phone, which keeps friends within instant reach. Transmitting at the speed of light, the Internet epitomizes the instantaneity with which information can be conveyed. This implies that the chance, or risk, of acting on the information can become even more critical than when a telephone "hot line" was set up between the Soviet Union and the United States during the Cold War. Other frontiers have been breached as well: electronic voice and facial recognition, machine translation, even artificial intelligence.
The computer has allowed investigation of several variables at once. Statistics, as will be seen in Chapter 13, has become a very sophisticated mathematical tool for checking the validity of information from scientific research and from public opinion polls. The computer facilitates the use of statistics and speeds research even further.
Information has exploded. With the easily observable aspects of nature already investigated, the most hidden ones are now the subjects of PhD dissertations. Since World War II, a doctoral degree has become the minimum requirement for many professions, and so we came to the "publish or perish" era. Information has proliferated to the point that it is often said the key to power is information.
With every type of organization flooding us with brochures and blandishments, "junk mail" and telemarketing have brought us to an era of information pollution. What does this say about the significance of the content of what is published? What does it say about our ability to absorb the information and savor its subtleties? It says we need to burnish this wonderful tool of language and our skills in using it.
Finally, communication itself has blossomed into a specific discipline, treated as a science. More will be said about this in later chapters, but first, more fundamental advances must be noted.
Two Science Giants
Early in the twentieth century, two powerful scientific concepts were enunciated in the realm of physics, and they are germane to our subject. How could such seemingly unrelated subjects as physics, communication, and making judgments be connected? This will become apparent. Although each concept deals with the most arcane aspects of science, and although the two concern opposite ends of the spatial scale, their implications have helped to shape the way we size up our world.
The two concepts are Einstein's Relativity Theory and Heisenberg's Uncertainty Principle. Don't be terrified by them! One needn't get technical to see how they play a part in effective communication.
As to Relativity, many of us have had experiences that demonstrate in a simple way what Einstein was essentially saying. If you are on a moving train and a train on an adjacent track is going in the opposite direction, you cannot tell, apart from feeling the motion, whether your train is standing still and the other one is going backward, or whether the adjacent train is standing still and yours is going forward. Einstein pointed out that while we on Earth consider ourselves to be on a stable, fixed platform, in outer space, where all the heavenly bodies are in motion, it is impossible to measure exactly where any body is relative to some fixed reference point. Thus, in space, all positions are relative.
Furthermore, because measurements in space depend on light (or other electromagnetic radiation), and because light travels at a finite speed, time is a major determinant in measuring distance in space. So the second powerful change in conceptualization that Einstein introduced was that time and space are interrelated; they are opposite sides of the same coin.
Apart from this powerful and fundamental way of thinking about concepts that we on Earth tend to give absolute values to, the aspect of this conceptualization that is even more relevant to the way we form evaluations is Einstein's very mode of thinking. This was unique in four respects: first, the way he conceived abstractions; second, the non-absolute nature of these concepts (both of these will be covered in Chapter 12); third, their interrelationships; and fourth, the implication that such concepts depend upon constant motion and change.
Coming down to earth, what these modes of thinking translate to are the questions we ought to ask ourselves when we talk about abstractions. What do we really mean by "terrorism," "family values," "free enterprise"? How do we measure these ideas? How fixed are our views about them, and therefore how certain can we be of our assessments of them at any one time?
Take the idea of cause and effect, for example. Think how many interpersonal arguments revolve around one party blaming the other for causing a problem. Think how this is scaled up to the arguments between nations. Can a cause be considered in isolation? Is it not the effect of a prior cause that we need to seek? Was the cause perceived at a given time really the source of a problem at the time it was introduced?
And is not our assessment of "the cause" interrelated with many other factors? Was the cause of the collapse of the World Trade Center towers the airplanes that crashed into them, an inherent weakness in the towers' structural design, or the jihad of some terrorists? Was the crash of the American Airlines plane in November 2001 the result of the pilot's overuse of the rudder when the craft encountered turbulence, the turbulence itself, or a weakness in the plane's design? The tendency is to look for a single major cause, when most problems result from both a prior condition and an activating event. The tendency to engage in the blame game is like little children claiming, "No, you started it!"
Think, too, of how we characterize people and events with adjectives. We may say someone is clumsy or intelligent, or that an event is disgusting or marvelous. We do so with only implicit and subjective bases for comparison; such terms cannot be calibrated the way scientists calibrate their instruments.
Scientists have definitive standards for their technical terms. But even with the dictionary, how standard are our definitions? Connotations may change as language changes, however slowly. More importantly, what standards can we have for our evaluations, which frequently involve these connotations as well as imprecise, abstract ideas? Scientific terms are defined operationally; that is, by what the term means in action, by what the concept does.
This mode of thought is a model to emulate if we are to avoid poor evaluations.
Einstein's ideas about space and time can be said to imply an indeterminacy of location. At the other end of the physical world, in quantum physics, Werner Heisenberg concluded that if one could determine the position of a subatomic particle within the atom, one could not determine its motion (related to its energy); conversely, determining its motion would not allow one to locate its position at any instant. This is almost like viewing a motion picture: watching the film, one cannot discern an individual frame, while examining a single frame, one cannot see the motion. More important for its relevance to communication, however, was Heisenberg's realization that the mere act of observing or trying to measure such a particle would affect its properties. This is essentially because the means needed to make such measurements are of the same order of magnitude as the thing measured. One needs a finer tool to measure something coarser.
An example from ordinary life is the way phrasing a question can affect the answer. If a teacher is asked, "Do you feel minority students are underachievers?" the question has two built-in biases, which would be absent if the teacher were asked, "What factors play a part in student achievement?"
Political polls are notorious for slanting the questions so as to produce results that the poll-taking organization favors. We may engage in this sort of manipulation unconsciously and innocently in casual conversation. A woman, with a beaming smile, may ask a friend, "How do you like my new dress?" thus encouraging approval. Instinctively, we all know that our demeanor can affect another person's behavior toward us.
Indeterminacy, as most of us know, is an aspect of normal life. Some of us cope better with such uncertainty than others. Religious people may console themselves with a belief in God, who, they say, "moves in mysterious ways." Or they may accept that, "It is not up to man to answer these questions." In assessing a situation or a problem, such people may accept troublesome aspects or take no action to try to change them. A more scientific mindset would undoubtedly take either of two paths: either ferret out evidence and try to reason through alternatives to possible solutions, or, with presently imponderable questions such as "What's beyond the Universe?" or "What was there before the Big Bang?" accept the indeterminacy precisely because there is no evidence for an answer. Scientists may certainly believe in God in a spiritual sense, but it is unlikely that most scientists would hold that there is a Being "up there" who plans and controls everything. The reason is that the religious view of God is not based on evidence, certainly not on universally accepted evidence of the same kind. Nor is what some might consider "evidence" independently and consistently reproducible, as is required of scientific experiments.
Acceptance of indeterminacy and awareness of tolerances are not simply mechanical aspects of the scientists' mode of thought; they are psychological aspects of a mindset. That is, while a scientist may have a gnawing dissatisfaction because he or she has not found an explanation for some phenomenon, this is an intellectual challenge; it is not the same kind of troubling anxiety that leads many people to jump to conclusions with erroneous evaluations about vexing problems for which they don't know the cause and can't work out a cure.
Two Others of Note
Two other thinkers of the past century grappled with questions that have to do with communication and judgments. One question arose from the "language of science," mathematics; the other grew from statements about logical propositions.
In the latter part of the nineteenth century, theoretical mathematicians—off in their recondite realm—began to ponder a vexing problem: How could they be sure that the axioms, the bottom-line assumptions upon which they based their reasoning, could be valid for all future theorizing? Could one draw an inference from some axiom that would be inconsistent with another inference that could be drawn from that same axiom? These aspects of validity and consistency hover over all scientific theorizing and are important considerations for opinion forming by the average layman.
In 1931, Kurt Goedel concluded that it would be impossible ever to prove that a set of axioms could be both complete and consistent. If they were complete—that is, covering all possible cases—they could not be consistent, and vice versa. This theorem was almost a parallel in logic to Heisenberg's principle. This may seem like airy philosophizing, but it rested on some very basic groundwork: the enunciation of the logical rules by which even the axioms were formulated. These were expressed in symbolic logic, with signs representing "if," "and," "if this, then ...," and other elements that tie statements together.
Even these expressions have to be described at some point by words. Early in the last century, Ludwig Wittgenstein pondered the question of how one could state something in the simplest, most basic, unambiguous manner, such that each word would be intuitively understood without requiring further definition. How could one state the most elemental truth in words? Since language is needed to explore language, we run into the same measurement problem that Heisenberg pointed out. The system couldn't be evaluated by the rules of that system; one couldn't be both judge and jury. Wittgenstein's speculations about the word-as-linguistic-element versus the word-as-indicator led to his claim that the truth of a proposition depends on its verifiability, not on logical thinking alone. Evidence and operational definition strike again! Wittgenstein's work heightened the interest of linguists, philosophers, and writers in general in the significance of what we say and how we say it.
Communication as a Science
So what qualifies the study of how we talk and think as a science? We tend to think of science as a matter of test tubes and microscopes, or more loftily as a pursuit of what makes things tick or a curiosity about the unknown. Science means orderly experimentation and verification of results. More fundamentally, it is a mode of thought. As to how this subject became a science, first, some history.
Toward the end of the nineteenth century, linguistics became a subject of study, particularly in the United States with regard to Native American languages. Psychology, too, developed as a formal discipline, with studies of how people perceive things, optically and mentally. After World War II, the dynamics of group interaction became a sub-discipline. More recently, the dynamics of resolving conflicts have been explored, with the most practical of goals: improving communication between individuals and between groups as large as nations. With Nazi propaganda techniques as a goad, the study of persuasion became another sub-discipline. Even subtle behaviors, such as body language, personal space, and the movement of eye pupils, became the bases for research as aspects of communication.
Often unrecognized but underlying all of these aspects is the premise of science. This mode of thought and its protocols of application provide the warp and woof on which we weave the tapestry of virtually every aspect of life today. So a few words will be said about what this, too, involves.
The minimum requirement is observation. This might seem obvious, but it is not so simple. It implies not merely acuteness in what is seen or heard, but avoidance of selectively choosing what one observes. Furthermore, one observation will not suffice; to have significance, an observation must be duplicated.
Even though scientific instruments are constantly improved to increase precision of observation, the scientific mindset accepts that the data obtained may not have an absolute value. Data have tolerances of plus or minus a certain amount. These depend upon the quality of the instrument and the limitations of human observation. Measurements also depend on comparisons with standards against which instruments, materials, and processes are calibrated.
Consideration of any observation involves its relationship to its context of time and place and to other similar or contrary observations.
Scientists know, too, that it is not sufficient to say that there is no evidence of something, as evidence could be lost, inadequately sought, or destroyed. More basically, it is always possible that somewhere in the universe at some time something might occur or exist. So scientists cannot prove that something does not exist; they can only prove, by obtaining evidence, that something does exist. This is the reasoning behind the null hypothesis: one assumes there is no effect and then seeks evidence strong enough to reject that assumption.
Whereas a person without a scientific bent might take an observation as merely an interesting fact, scientists almost automatically look for a cause. That leads to a tentative idea, a hypothesis, to be followed up by repeated experiments. These aim to test the validity of the idea, preferably using different methods to show that the same result can be reached in several ways.
Excerpted from THINK SMART, TALK SMART by Allan Laurence Brooks. Copyright © 2011 by Allan Laurence Brooks. Excerpted by permission of iUniverse, Inc. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.