Based on a tutorial workshop, this book is a complete descriptive reference for cable television systems and the most up-to-date, comprehensive treatment of cable television technologies available. It covers issues not addressed in any other book, such as modern headend design, reliability calculations, modern architectures, and equipment interfaces, and it summarizes key standards including DOCSIS, DVS, NRSS, DSWG, EIA-542, and IS-23.
The interest in digital television goes back several decades. Long before digital television was practical for consumer use, it held interest for military and professional applications. The principal attractions are those of all digital signals: the ability to transmit signals over arbitrarily long distances and to store the signals without degradation; the ability to process signals for security and other purposes; the ability to remove unnecessary redundancy to increase transmission and storage efficiency; and the ability to add appropriate coding and redundancy so that signals can survive harsh transmission environments. In the late 1960s, Zenith Radio Corporation worked on the digitization of television signals for the purpose of enhancing scrambling for an over-the-air subscription television service.
As we mentioned, to be properly reproduced, an analog signal must be sampled at a rate at least twice that of its highest frequency. That theoretical minimum rate requires perfect (nonrealizable) filters at the receiving end. To use practical filters, an even higher sampling rate is required. The most common sampling rates chosen for use in digital television with resolution comparable to NTSC are 13.5 MHz for the luminance signal and half that rate for each of the two color difference signals, for a total of 27 megasamples per second.
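The need to sample above the Nyquist rate can be illustrated numerically: a tone above half the sampling rate, once sampled, is indistinguishable from a lower-frequency tone. The following sketch uses the 13.5-MHz luminance rate from the text; the 8-MHz input tone is a hypothetical value chosen for illustration.

```python
import math

# Aliasing demo: a tone above fs/2 folds down to a lower frequency
# and produces exactly the same sample sequence.
fs = 13.5e6                # luminance sampling rate from the text
f_in = 8.0e6               # hypothetical tone above fs/2 = 6.75 MHz
f_alias = abs(fs - f_in)   # folds down to 5.5 MHz

n = range(8)
orig = [math.cos(2 * math.pi * f_in * k / fs) for k in n]
aliased = [math.cos(2 * math.pi * f_alias * k / fs) for k in n]

# The two sampled sequences are numerically identical, so the receiver
# cannot tell the 8-MHz tone from the 5.5-MHz one:
assert all(abs(a - b) < 1e-9 for a, b in zip(orig, aliased))
```

This is why the analog signal must be band-limited below half the sampling rate before conversion, and why practical (non-ideal) filters force the sampling rate somewhat above the theoretical minimum.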
Interestingly, the use of compression allows the program bandwidth and the transmitted bandwidth to be very different. A modern television camera for NTSC actually produces video frequencies beyond what can be transmitted in the 6-MHz broadcast channel. Those frequencies are simply thrown away in analog practice. In compressed television, some of them may be used. The 13.5-MHz sampling rate supports a luminance bandwidth of 5.75 MHz and a bandwidth of 2.75 MHz in each of the two color difference channels. The NTSC technique of choosing color axes that favor the human visual system and limiting one to 0.5 MHz while allowing the other to go to 1.5 MHz is not necessary. Compression allows both color difference signals to have the same 2.75-MHz bandwidth.
It was determined by experiment that a minimum of eight bits per sample is required to minimize visible artifacts due to the quantization of the data. In some applications, ten bits per sample are used. In digital television systems using 27 megasamples per second and eight bits per sample, uncompressed transmission requires 216 Mb/s for the video data. Of course, audio must be digitized as well.
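The uncompressed video rate follows directly from the sampling rates and bit depth given above; a quick sketch of the arithmetic:

```python
# Uncompressed digital video rate from the figures in the text.
luma_rate = 13.5e6               # luminance samples per second
chroma_rate = luma_rate / 2      # each of the two color difference signals
total_samples = luma_rate + 2 * chroma_rate   # 27 megasamples per second
bits_per_sample = 8              # experimentally determined minimum
video_bit_rate = total_samples * bits_per_sample

print(total_samples / 1e6, "Msamples/s")   # → 27.0 Msamples/s
print(video_bit_rate / 1e6, "Mb/s")        # → 216.0 Mb/s
```

With ten bits per sample, the same arithmetic gives 270 Mb/s, which is why ten-bit studio interfaces run at that rate.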
Using our bandwidth rule of thumb, something over 100 MHz of bandwidth would be required. This is not a problem internal to systems that process a single channel or for transmission of a few channels over fiber-optic links. But this kind of bandwidth consumption is not acceptable for broadcast or for multichannel environments where the number of channels is important to subscriber appeal. There is a clear need for methods of reducing the bit rate to acceptable levels.
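The "something over 100 MHz" figure is consistent with a rule of thumb of roughly 2 bits per second per hertz; the exact figure assumed here is an illustration, since the text does not restate it:

```python
# Mapping the uncompressed bit rate to an equivalent analog bandwidth.
video_bit_rate = 216e6     # b/s, the uncompressed video rate from the text
bits_per_hz = 2.0          # assumed rule-of-thumb spectral efficiency
bandwidth_hz = video_bit_rate / bits_per_hz

print(bandwidth_hz / 1e6, "MHz")   # → 108.0 MHz, "something over 100 MHz"
```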
The big push for consumer digital television came as a consequence of the search for a practical high-definition television (HDTV) system. HDTV was defined as having twice the horizontal resolution, twice the vertical resolution, a wide picture with a ratio of 16 units of width for every 9 units of height, no visible artifacts at reasonable viewing distances, and compact disc quality sound. Before any processing, the analog signal coming from an HDTV camera could consist of 30 MHz of red information, 30 MHz of green information, and 30 MHz of blue information. Almost 100 MHz of analog information is involved. If this is converted to digital form, it is first sampled at more than twice the highest frequency. Then each sample is represented by a byte of data. More than a gigabit per second of data transmission is required. It can be appreciated why all the early HDTV proposals were analog! Methods of reducing this bandwidth appetite had to be found.

The work on HDTV began in Japan in 1970 using analog transmission technology coupled with significant digital processing at both the point of origination and at the receiver. The Japanese proposal was called Multiple SubNyquist Sampling Encoding (MUSE). It applied many of the techniques used to minimize NTSC bandwidth. The goal was to match the HDTV system to the human visual response, since there is no need to transmit what the eye does not see.
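The "more than a gigabit per second" claim can be checked with back-of-the-envelope arithmetic. The 2.2× oversampling factor below is an assumption standing in for "more than twice the highest frequency":

```python
# Raw HDTV data rate from the camera signal described in the text.
analog_bw_per_channel = 30e6     # Hz each of red, green, and blue
oversampling = 2.2               # assumed; text says "more than twice"
sampling_rate = oversampling * analog_bw_per_channel
bits_per_sample = 8              # one byte per sample
raw_rate = 3 * sampling_rate * bits_per_sample

print(raw_rate / 1e9, "Gb/s")    # → 1.584 Gb/s, well over a gigabit/s
```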
When the FCC decided to pursue HDTV in the United States, it laid down a number of challenges. The first challenge was for a "compatible" HDTV system that would not obsolete current NTSC receivers. The second major challenge was that the original NTSC and the new HDTV signals were to be limited to no more than two 6-MHz slots in the television spectrum. A further restriction required the new signals to avoid undue harm to existing NTSC transmissions. After an extended search for a compatible method of creating HDTV, it became clear that all methods proposed used the original NTSC signal plus in-band and out-of-band "helper signals." All these resources were required to create the compatible signal, and two 6-MHz bands were consumed for each HDTV signal. This approach meant that NTSC would always have to be supported. It also meant that 12 MHz was committed for each HDTV signal. If there were ever to be a transition away from broadcast NTSC, this approach would have to be abandoned.
Zenith Electronics Corporation first broke ranks with the analog proponents by proposing a hybrid system that continued the transmission of the high frequencies of the image in analog form but converted the lower frequencies to a digitized form. This hybrid approach seemed to use the best of both worlds. It recognized that most of the energy in an NTSC signal is in its low frequencies that include the synchronization pulses. In NTSC, the sync pulses have the highest energy because they are separated from the video signal by amplitude discrimination. By digitizing the low frequencies, the majority of their power consumption was eliminated. Yet the burden on the digital circuits was relaxed because only relatively low frequencies were processed. The high frequencies remained analog and contributed little to the power requirements. The lower data rate digital signals might also be less susceptible to multipath. The remaining problem was that this approach was no longer compatible with existing NTSC receivers. This problem was solved by allowing compatibility to include simulcasting. That is, both the hybrid signal and the NTSC signal would carry the same programming, but at two different resolutions and at two different frequencies. This would preserve the utility of older receivers. And since no successful system that put both NTSC and HDTV into the same 6 MHz had been proposed, two 6-MHz channels would still be required, as in all the other proposed systems.
This approach had a number of major advantages. The low-power HDTV signal was tailored to cause minimum co-channel interference in adjacent locations using the same frequencies. Also, since the NTSC signal was not needed as a component of the HDTV signal, it could eventually be abandoned. The NTSC channel could then be reallocated to other purposes. Even before that happened, the requirement for simulcasting could be relaxed based on policy rather than technological constraints. By this step-by-step process, compatibility was abandoned for the first time in broadcast television. (The noncompatible CBS color system, though temporarily the official system, did not achieve commercial success before it was replaced with the RCA/NTSC compatible color system.)
Shortly thereafter, General Instrument Corporation proposed an all-digital solution. Quickly, all serious proponents with the exception of the Japanese MUSE system converted to all-digital designs. The committee, charged with selecting a winner, found that it could not come to a conclusion. The technical issues were too complex, and the political issues were overwhelming. At the time a decision was to be made, all the proposed systems made unacceptable pictures. The result was an elaborate ruse to score all the systems as acceptable under the condition that a "grand alliance" be formed that allowed the proponents themselves to hammer out a single system. Thus the political battles could go on behind closed doors under the guise of selecting the "best of the best" for a single proposal to the FCC.
The television and computer industries have jointly created standardized toolkits for processing images and maximizing the efficiency of transmission and storage of the resulting digital realizations. Two important systems for accomplishing this are known as the Moving Picture Experts Group (MPEG) standards. The MPEG standards consist of collections of techniques that can be selected depending on the nature of an application. This progress in the area of digital TV bandwidth compression resulted in a national standard being selected by the FCC in December 1996. Included is a modulation scheme that conveys around 20 Mb/s in 6 MHz. If the original raw HDTV signal needed a gigabit per second of data, the resulting compression ratio is 50:1. Roughly 2% of the original information is really needed. Ninety-eight percent of the original information is redundant and can be discarded. Using this standard, a single HDTV program can now be transmitted within the analog broadcast TV channel assignment of 6 MHz rather than...
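The 50:1 ratio and the 2% figure both fall out of the rates quoted above:

```python
# Compression ratio implied by the figures in the text.
raw_rate = 1e9          # ~1 Gb/s raw HDTV data
channel_rate = 20e6     # ~20 Mb/s carried in a 6-MHz channel
ratio = raw_rate / channel_rate
retained = channel_rate / raw_rate * 100

print(ratio)      # → 50.0, i.e., a 50:1 compression ratio
print(retained)   # → 2.0, i.e., only 2% of the raw bits survive
```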