Chapter 1: Why Digital Television?

For example, early transistor-transistor logic (TTL) chips contained between about 30 and 100 transistors, a huge technical breakthrough at the time. TTL was quickly replaced with complementary metal-oxide semiconductor (CMOS) technology. With the reduced power dissipation of CMOS, larger integrated circuits could be built. Soon, an entire computer processor could be placed on a single chip. Enter the microprocessor.
This phenomenon gave rise to one of the most frequently cited harbingers of technological change: Moore's Law. Hardly a technical presentation goes by, regardless of industry segment, without someone mentioning Moore's Law in the context of startlingly swift technological growth.
Moore's Law stems from Intel chairman emeritus Gordon Moore, who observed that the number of transistors on a chip was doubling every 18 months. Once the trend was confirmed, it was dubbed Moore's Law, an axiom that shows no sign of failing, perhaps until 2030. (For more information on Moore's Law, see http://www.intel.com/pressroom/kits/bios/moore.htm.) Millions of transistors are now routinely placed on a silicon die, and many chips are now I/O-limited, which means that the cost of the chip has more to do with the number of leads and the packaging cost than with the number of transistors it contains.
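The compounding implied by an 18-month doubling period can be sketched in a few lines of Python. The starting count and time span below are illustrative assumptions, not figures from the text:

```python
# A minimal sketch of Moore's Law growth: transistor count doubling
# every 18 months. Starting count and years are illustrative assumptions.

def transistors_after(start_count: int, years: float, doubling_months: float = 18.0) -> int:
    """Project a transistor count forward under a fixed doubling period."""
    doublings = (years * 12.0) / doubling_months
    return round(start_count * 2 ** doublings)

# A chip with ~100 transistors, projected 15 years ahead:
# 15 years is 10 doubling periods, so growth is 2**10 = 1024x.
print(transistors_after(100, 15))  # -> 102400
```

Ten doublings in fifteen years turn a 100-transistor TTL-era part into a chip in the hundred-thousand-transistor class, which is roughly the scale of jump the text describes.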
Very large scale integration (VLSI) encourages the designer to place as many functions as possible on a single chip. The ultimate goal: a single chip that performs all the functions of a product (whether it is a television receiver, a set-top, or a personal computer). Because it is tricky to mix analog and digital functions on a chip, it makes sense to perform all possible functions in the digital domain. For example, relatively complicated digital circuits are replacing even trivial analog functions, such as audio mixing.
When it was realized (in the 1970s) that almost all analog processing could be done with more precision and much greater flexibility in the digital domain, the race was on to shift more analog functions to digital.
The first step in this process is called analog-to-digital (A/D) conversion. The analog signal is sampled (measured at points in time close enough together to adequately represent the analog signal), and each instantaneous value is represented as a binary value. After A/D conversion, most analog signal processing can be done in the digital domain. This technique is known as digital signal processing (DSP). After signal processing, a process known as digital-to-analog (D/A) conversion reconstructs the (modified) analog signal.
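The A/D → DSP → D/A chain can be illustrated with a toy example. The sample rate, bit depth, and the "processing" step (a simple digital gain) below are illustrative assumptions; real converters and filters are far more involved:

```python
# A minimal sketch of the A/D -> DSP -> D/A chain. The sample rate,
# bit depth, and gain stage are illustrative assumptions only.
import math

def sample_and_quantize(signal, sample_rate, duration, bits):
    """A/D: sample an analog signal (a Python function of time) and
    quantize each sample to a signed integer code."""
    levels = 2 ** (bits - 1) - 1          # max positive code, e.g. 127 for 8 bits
    n = int(sample_rate * duration)
    return [round(signal(i / sample_rate) * levels) for i in range(n)]

def dsp_gain(codes, gain, bits):
    """DSP: apply a gain entirely in the digital domain, clipping to range."""
    levels = 2 ** (bits - 1) - 1
    return [max(-levels, min(levels, round(c * gain))) for c in codes]

def reconstruct(codes, bits):
    """D/A: map integer codes back to analog-domain values in [-1, 1]."""
    levels = 2 ** (bits - 1) - 1
    return [c / levels for c in codes]

# A 1 kHz sine sampled at 8 kHz for 1 ms, quantized to 8 bits,
# attenuated by half in the digital domain, then reconstructed.
codes = sample_and_quantize(lambda t: math.sin(2 * math.pi * 1000 * t), 8000, 0.001, 8)
out = reconstruct(dsp_gain(codes, 0.5, 8), 8)
```

The point of the sketch is the one made in the text: once the signal exists as integer codes, the "processing" is just arithmetic, which is why digital processing offers more precision and flexibility than its analog equivalent.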
Early DSP applications had an analog input and an analog output, but soon the digital representation itself became the reference signal that was stored or transmitted. An early adopter of these techniques was the music industry, which embraced digital technology so thoroughly that it is nearly impossible to purchase cassette tapes and vinyl records today. Their successor, the compact disc, was introduced in the 1980s and harnessed Moore's Law to dramatically improve sound quality and the amount of music stored per disc (relative to its analog predecessors). By the 1990s, digital techniques had evolved (thanks in part to Moore's Law) to tackle the hundredfold increase in bandwidth that video requires compared to audio. A/D conversion and DSP are now cost-effective tools for television services and have found their way into most of the technologies described in Chapter 4, "Digital Technologies." This trend shows no sign of slowing down and continues to drive the migration to digital television.

Convergence with the Personal Computer

Moore's Law also made the development of personal computers (PCs) practical. Early PCs were very limited in performance and memory (remember when 4 MB of RAM was a big deal?), but reductions in price and quantum leaps in performance combined to create a multibillion-dollar industry around the PC. Standalone PCs remain somewhat limited in what they can do; all applications must be loaded from stored media, and it is still somewhat slow and cumbersome to share data with other PC users. Still, networking is transforming the PC; it is now possible to pipe in applications and data from the Internet and to use the PC as a communications tool. The development of standard protocols to support World Wide Web services also introduced a new mode for research and entertainment. PCs are now powerful enough to perform sophisticated multimedia processing (using digital signal processing).
Suddenly, convergence has reemerged as a buzzword to describe the personal computer as the focus of entertainment, computing, and communications services in the home. At the same time, convergence will create a divergence of in-home electronics, as the swift impact of Moore's Law yields customized, inexpensive chipsets that can be installed in many communications and entertainment gadgets. The cable modem gained wide U.S. acceptance in 1999; industry analysis firm Paul Kagan Associates (PKA) anticipates that 1.6 million cable subscribers will use a cable modem, at $40 per month, to link to the Internet at high speed (27 Mbps shared over a node versus 56 Kbps via dial-up). That figure could leap to 20 million subscribers by 2005, according to PKA (including other broadband connectivity devices, such as DSL and wireless modems). By the end of 2000, as many as three million advanced digital set-tops (that include a cable modem) will populate U.S. homes. Add to that DVD players, personal organizers, and boxes such as those made by TiVo and Replay that enable truly on-demand television viewing. Convergence has many faces, but it is really just the parallel application of evolving digital technologies across different fields. The technologies of the Internet and the World Wide Web are already finding their way into advanced analog set-top converters...