Single-Camera Video Production
By Robert B. Musburger
Copyright © 2005 Elsevier Inc.
All rights reserved.
Chapter One The Technology
The Importance of Technology
If a would-be artist were to suddenly pick up a brush and start to dab paint on a canvas, or any other handy surface, the chances of achieving an immediate masterpiece would be minimal. The same holds true for a sculptor. One does not attack a piece of marble with a chisel without first learning the skills necessary to properly mold the form without damaging the original material or exceeding the capabilities of the medium.
Likewise, running through the woods with an out-of-focus camera may seem creative, but it is neither good art nor good video. An understanding of the basic technology of any art form is necessary to use the artistic characteristics of that medium properly and to avoid the pitfalls of its technical limitations.
Video is highly technical; the medium requires some basic knowledge of optics, electronics, electricity, physics, and mathematics. Of course, a video production can be completed without any knowledge of these subjects, but the possibility of it being a high-quality production is limited.
With the development of lighter, smaller, and more powerful equipment that operates using digital technology, you can create higher-quality video productions at a lower cost than was possible a few years ago. However, the advances in digital technology that make for better productions also require some knowledge of the digital domain and how it can and should be used in video production. Such digital equipment is easy to operate with minimal knowledge of the media production process, but ease of operation does not replace the thinking and creativity necessary for a quality production.
To use video cameras and associated audio equipment effectively, you must be aware of the capabilities and limitations of each piece of equipment. In addition, you must know how each piece of equipment operates in relation to other equipment used in the same production. This awareness does not necessarily mean having a broad range of knowledge of the technology involved in media production but rather having an appreciation and understanding of why the equipment is designed to operate as it does and what it can accomplish. Most important, it is necessary to understand what it cannot be expected to accomplish. Digital equipment does not replace the knowledge of composition, shot sequencing, or the construction of characters and story lines necessary to assemble a professional production. In fact, the basics of production are even more important in digital production because of the high level of resolution and clarity made possible in the digital formats. This clarity reveals poor lighting, bad framing, incorrect exposure, and all other gaffes that would barely manifest in analog production.
Limitations of Equipment
The human eye and ear are two extraordinary instruments for sensing light and sound. No human invention has ever come close to matching the capabilities of those two sensory organs. It is easy to forget how limited the electronic aural and visual equipment are until we compare them to their human counterparts.
The human eye can focus from nearly the end of the nose to infinity instantaneously. The eye can adjust to light variations quickly and can pick out images in light varying more than 1,000 times from the lightest to the darkest. The human ear can hear sounds varying in loudness from 0 decibels to more than 160 decibels and can respond to frequency changes from 15 hertz to more than 20,000 hertz.
The best professional digital camera cannot reveal detail in light variations greater than 80:1; most prosumer cameras have difficulty creating acceptable images beyond a contrast range of 40:1. The best lenses have limited focus range, and the depth of field depends on the amount of light present and focal length and f-stop settings. The best microphone is limited to less than 60 decibels in loudness range. Most audio equipment cannot reproduce frequencies beyond a range of 10,000 hertz without inconsistent variations. Digital equipment allows repeated duplication of signals without degradation, but it does little to extend the dynamic range of either sound or video.
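These contrast ratios can be restated in photographic stops: each stop doubles the light, so a device's range in stops is the base-2 logarithm of its contrast ratio. A quick sketch (plain Python; the helper name is mine for illustration, not an industry tool):

```python
import math

def contrast_ratio_to_stops(ratio: float) -> float:
    """Each photographic 'stop' doubles the light, so the number of
    stops a device can span is log2 of its contrast ratio."""
    return math.log2(ratio)

# The chapter's figures, expressed in stops:
print(round(contrast_ratio_to_stops(80), 1))    # professional camera, ~6.3 stops
print(round(contrast_ratio_to_stops(40), 1))    # prosumer camera, ~5.3 stops
print(round(contrast_ratio_to_stops(1000), 1))  # the eye's 1,000:1 range, ~10 stops
```

The comparison makes the gap concrete: the eye spans roughly four stops more than the best professional camera.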
It is important to remember that the audio/video equipment converts sound and light to electronic impulses and that, regardless of the expense or quality of the equipment used, it cannot capture everything that the human eye and ear can capture. The newest developments in digital cameras and audio processing have extended the ability of the recording process to more closely reproduce what a human can see and hear. However, the limitations of the equipment, whether analog or digital, taped or recorded on solid state, still play a crucial role in the planning of a video production.
The Audio Signal: Frequency
The audio signal has two basic characteristics: frequency (tone) and amplitude (loudness). To create and record sound effectively, you must understand both characteristics. All sound starts as an analog signal because the vibration in air that creates sound is an analog motion. Digital sound is created within the equipment and must be converted back to analog for humans to hear it.
Frequency is measured in hertz (or cycles per second), usually abbreviated as Hz. Because many of the sounds humans can hear are above 1,000 Hz, the abbreviation kHz (representing kilohertz) often is used; k is the abbreviation for kilo-, the metric equivalent of 1,000.
A cycle is the time or distance between peaks of a single sound vibration. A single continuous frequency is called a tone and is often used for testing. Humans perceive frequency as pitch, the highness and lowness of tones. The term timbre is a musical term often used in media production that refers to the special feeling a sound may have as a result of its source. For example, a note struck on the piano may be the same frequency as that of the same note played on a trumpet, but the timbre is very different.
The energy spectrum ranges from 0 Hz to more than a yottahertz, a septillion (10^24) hertz. The frequency range most humans can hear is between 10 Hz and 20 kHz. Frequencies above the audible range include the radio frequencies (RF) used as broadcast carrier waves, microwaves, X-rays, and visible light.
Most analog videotape recorders can record only audio frequencies between 30 Hz and 10 kHz. Digital recorders are capable of recording a broader range, but because human hearing is limited, so is digital recording. The frequencies excluded in recordings are not generally missed unless the production requires a wide range of frequency response, such as a music session.
The range of frequency response and certain portions of frequencies may be modified as needed for an individual production. This altering of the frequency response is called equalization. When you adjust the tone controls, treble or bass, on a stereo, you are equalizing the signal by modifying the frequency response. Although most videotape recorders do not have equalization controls, some audio mixers and microphones do.
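The idea of equalization can be sketched in a few lines of code. The function below is a deliberately crude, hypothetical illustration (not how a real equalizer is built): a one-pole low-pass filter, where smoothing the samples attenuates high frequencies, much like turning a treble control down. The smoothing factor `alpha` is an illustrative value, not a calibrated EQ setting.

```python
def treble_cut(samples, alpha=0.3):
    """Crude one-pole low-pass filter: smoothing attenuates high
    frequencies, similar in effect to lowering a treble control."""
    out = []
    prev = 0.0
    for s in samples:
        prev = prev + alpha * (s - prev)  # move only part way toward each new sample
        out.append(prev)
    return out

# A rapidly alternating (high-frequency) signal is strongly attenuated:
print(max(abs(x) for x in treble_cut([1.0, -1.0, 1.0, -1.0])))  # well under 1.0

# A steady (low-frequency) signal passes through almost unchanged:
print(round(treble_cut([1.0] * 20)[-1], 3))  # ~0.999
```

Real equalizers combine banks of much more selective filters, but the principle is the same: boost or cut chosen portions of the frequency range.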
The Audio Signal: Amplitude
Amplitude is the energy level of the audio signal. The listener perceives amplitude as loudness. Relative amplitude is referred to as level and is measured in decibels, abbreviated as dB. Deci- is one tenth on the metric scale, and the bel is the measure of audio amplitude named for Alexander Graham Bell. Because the bel is a very large unit, dB is more commonly used. The decibel can be a confusing unit because it measures the change in the power of a signal relative to a reference. It is not an absolute measurement, it is logarithmic rather than linear, and it can be expressed as a ratio of either voltages or powers. A change of at least 3 dB is necessary for the human ear to perceive a change in level. Levels are measured in both positive and negative decibels.
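The logarithmic definition is compact enough to compute directly. A short sketch of the two standard formulas (the function names are mine, chosen for clarity):

```python
import math

def db_power(p, p_ref):
    """Decibels from a power ratio: 10 * log10(P / Pref)."""
    return 10 * math.log10(p / p_ref)

def db_voltage(v, v_ref):
    """Decibels from a voltage ratio: 20 * log10(V / Vref),
    because power is proportional to voltage squared."""
    return 20 * math.log10(v / v_ref)

print(db_power(2, 1))    # ~3 dB: doubling power is the smallest change the ear notices
print(db_voltage(2, 1))  # ~6 dB: doubling voltage quadruples power
print(db_power(0.5, 1))  # ~-3 dB: ratios below the reference give negative decibels
```

Note how the same physical change, doubling, yields 3 dB or 6 dB depending on whether power or voltage is being compared, which is exactly why the decibel trips people up.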
Volume is the term used when referring to the measurable energy that translates into loudness; it may be measured in either volume units (VUs) or decibels. Humans are sensitive to changes in volume, but human hearing is not linear: at some frequencies and volume levels, the ear senses a change, but the actual amount of change is not registered accurately by the brain. The Robinson–Dadson loudness curves indicate the relative sensitivity humans have to various frequencies and loudness levels.

Analog audio equipment can handle a volume change of no more than approximately 60 dB; digital recording equipment can handle a greater dynamic range. In either case, accurate level readings must be available during recording to avoid distorted or noisy sound.

There are two aberrations of audio to watch for: distortion and noise. Distortion is an unwanted change in the audio signal. The most common distortion is caused by attempting to record the audio at a level too high for the equipment. In digital audio systems, excessive levels may cause the audio to skip or cease entirely; analog distortion is heard as warped, wavering, or massive variations from the original sound.

Noise is unwanted sound added to the audio. Digital systems are sensitive to all sounds, so noise may be added to a recording that is not adequately monitored. Analog noise may be introduced if the audio level is set too low: tape hiss or other sounds generated within the system may be heard along with the original sound.
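Overload distortion can be pictured with a toy model. The sketch below (plain Python, with illustrative sample values of my own, not from the book) shows hard clipping: any sample beyond what the equipment can handle is flattened at the limit, which changes the shape of the waveform and therefore the sound.

```python
def hard_clip(samples, limit=1.0):
    """Toy model of overload distortion: levels beyond what the
    equipment can handle are flattened at the limit."""
    return [max(-limit, min(limit, s)) for s in samples]

print(hard_clip([0.5, 1.8, -2.0]))  # [0.5, 1.0, -1.0]: the peaks are squared off
```

The in-range sample passes through untouched; the two overloaded peaks are sheared off, and that lost waveform shape is the distortion.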
The audio level can be measured as it is being recorded using a VU meter, a peak-to-peak meter, or light-emitting diodes (LEDs). Each of these gives the operator an indication of the level of the audio. When the level is too high, the meters read greater than the 0 dB indicator; with LEDs, the changing color of the flashing diodes indicates the audio level. When the level is too low, the meter needles barely move, and few, if any, diodes flash.
The audio operator attempts to keep dynamic levels within the specified range of equipment, whether digital or analog, by attenuating the level (decreasing it) when the audio source is too loud and boosting the level (increasing it) when the audio source level is too low. This is called riding gain, and it may be done either manually by the operator or automatically by circuits built into the equipment called automatic gain controls (AGCs) or automatic level controls (ALCs). AGCs and ALCs will maintain certain maximum and minimum levels, but they may add noise by boosting levels during a soft or quiet passage or by overdriving if there is a sudden, loud increase in the input.
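The logic of riding gain can be sketched as a very simplified automatic gain control. The code below is a hypothetical illustration (the target peak and gain cap are made-up values, not broadcast specifications): it measures the peak of a block of samples and scales the block toward a target level, attenuating loud passages and boosting quiet ones.

```python
def ride_gain(samples, target_peak=0.7, max_gain=4.0):
    """Very simplified AGC: scale a block of samples so its peak
    approaches target_peak, without boosting by more than max_gain."""
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return list(samples)  # silence: boosting would only amplify noise
    gain = min(target_peak / peak, max_gain)  # attenuate loud, boost quiet
    return [s * gain for s in samples]

print(ride_gain([0.05, -0.03, 0.04]))  # quiet passage: boosted (capped at max_gain)
print(ride_gain([0.9, -1.2, 1.1]))     # loud passage: attenuated toward the target
```

The gain cap also illustrates the drawback the text mentions: during a quiet passage the circuit applies maximum boost, which raises any system noise right along with the signal.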
Dynamics refers to the difference between the loudest and the quietest passage. Most analog equipment is limited to a range of approximately 60 dB; newer digital equipment features dynamic ranges greater than 100 dB. Crickets at night might be heard at 3 dB, a normal conversation at about 60 dB, and a rock concert at 130 dB or greater (which is above the threshold of pain and can be damaging to hearing).
To achieve the greatest possible audio quality, record and reproduce sound as close to the original as possible. Even though it is not possible to record all frequencies at the exact same level as the original sound was recorded, a successful operator makes the effort to exclude all noise and to avoid distorting the audio signal.
Two additional measurements are required for digital audio: sampling and quantization. Sampling is the number of times per second the analog sound is measured as it is converted to digital. To preserve maximum quality, sampling must be done at at least twice the highest frequency expected to be converted (the Nyquist rate). Currently, the standard sampling rate is 48 kHz; compact discs are sampled at 44.1 kHz. Quantization is the number of discrete levels at which the analog sound is measured as it is converted. The greater the bit depth of quantization, the higher the quality of the conversion; 16- to 24-bit quantization is currently considered professional. Increasing either the sampling rate or the bit depth increases the demand for memory and bandwidth for digital media files.
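These relationships reduce to simple arithmetic. A sketch (the function names are mine; the figures follow directly from the rates discussed above):

```python
def nyquist_rate(max_freq_hz):
    """Sampling must run at least twice the highest frequency to be captured."""
    return 2 * max_freq_hz

def pcm_data_rate(sample_rate_hz, bits_per_sample, channels=2):
    """Uncompressed PCM data rate in bits per second."""
    return sample_rate_hz * bits_per_sample * channels

print(nyquist_rate(20_000))                  # 40000: why 44.1/48 kHz cover human hearing
print(2 ** 16)                               # 65536 discrete levels at 16-bit quantization
print(pcm_data_rate(48_000, 24) / 1_000_000) # 2.304 Mbit/s for 24-bit/48 kHz stereo
```

The last line shows why raising sampling and quantization rates raises storage and bandwidth demands: the data rate is simply their product.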
The Video Signal
The video signal, like the audio signal, is made up of voltages varying in frequency and level. Even though the video electronic signal is complex, it should be considered in much the same manner as the audio signal. The camera cannot record all that the eye can see, nor can a videotape recorder—whether analog or digital—process all the information fed into it. Avoid video distortion and noise as diligently as you avoid audio distortion and noise.
Video distortion and noise are defined in much the same way as audio distortion and noise, except that you can see video distortion as flare in brightly lit areas, as tearing, or as color shifts in the picture; video noise can be seen as a grainy or "crawly" texture to the picture.
Excerpted from Single-Camera Video Production by Robert B. Musburger. Copyright © 2005 by Elsevier Inc. Excerpted by permission of Focal Press. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.