The transmission towers on Mt. Mansfield in Stowe, VT are home to WCAX-DT and other local broadcasters.
Although broadcasting has been around for more than a century, it has been associated with digital technology for only the last 10 to 15 years. When we think of digital television or radio today, most people, aside from those skilled in RF engineering, would assume that the systems are completely digital. At the same time, most people would not associate digital technology with early radio communications and experimental television. In fact, digital and analog technologies are tightly woven together across the history of broadcast technology.
Broadcasting quickly evolved from early radio communications. The concept of broadcasting (i.e., reaching many with one transmission and message) sprang from early wireless communications. The most basic transmission technology of this communication medium began in the era of spark. It evolved into high-frequency-alternator technology and then to tube-based oscillator and amplifier technology.
One common theme across these technologies is that the RF signal starts with a single- or multiple-carrier analog waveform, which is a requirement to transmit the signal in a finite bandwidth. That fact holds true for all digital communications and broadcasting today. It is only the way that the message modulates the analog RF waveform that differentiates between analog and digital broadcast systems.
The first radio communications that led to broadcasting were digital in nature. For these transmissions, the RF carrier wave was turned on and off in a pattern that was able to communicate a message. In that case, it was Morse code. The earliest of these transmissions began in the late 1890s.
Shown is a current VHF transmitter installation as well as a tower/antenna.
By 1906, the concept of voice communications and broadcasting evolved, as demonstrated by Reginald Fessenden during his Christmas Eve broadcast of an opera singer to ships at sea. Fessenden superimposed voice communications on the RF carrier wave by varying the amplitude of the transmitted signal, a technique that became known as amplitude modulation (AM).
AM broadcasting continued as an experimental activity until 1920, when the Department of Commerce issued the first commercial broadcast license to Westinghouse Electric Co. (Pittsburgh, PA) to operate radio station KDKA. In a matter of a few short years, licenses were issued and stations sprang up in a variety of locations around the US. All of these stations relied on an entirely analog transmission system.
The number of stations grew as consumer interest in broadcasting mushroomed. Broadcast spectrum allocation was expanded, and many stations pushed for increasingly higher power levels to improve reliability and reception quality.
Broadcast station WLW in Cincinnati, OH was an example of the evolution of AM broadcasting to higher power levels. The pinnacle of early high-power AM broadcasting was reached in 1934, when Powell Crosley's WLW signed on with 500,000 W of power and became the "Nation's Station" (Fig. 1 and Fig. 2).
The RCA Victor Co., working with subcontractors General Electric and Westinghouse, designed a unique 500,000-W AM transmitter for WLW (Fig. 3). This transmitter was designed and built at the WLW transmitter site in Mason, OH by a young team of electrical engineers whose average age was 27. It was a marvel of RF technology in 1934, and generations of RF engineers still learn from it today.
WLW operated with 500,000 W continuously from 1934 until 1939, when the Federal Communications Commission (FCC) placed a 50,000-W limit on AM broadcast power in the US. Experimental operation continued at 500,000 W until 1943 (Fig. 4).
Two other broadcast technologies were developing at this time. Television was under development by several groups around the country. In 1928, AT&T demonstrated a studio-televised image transmitted via long lines between New York and Philadelphia. The hybrid system used an analog signal to control the brightness of the individual picture elements, while a progressive, digital electromechanical switching system scanned the matrix of sensors and display lamps in the transmission device and receiver. Not long after the AT&T demonstration, Philo T. Farnsworth, often referred to as the father of television, demonstrated analog sawtooth scanning for television. TV was off on an all-analog race until the late 1980s.
Transition From AM To FM
The other modulation technology under development in the 1930s was frequency modulation (FM), spearheaded by Edwin H. Armstrong at Columbia University. Armstrong was trying to solve the same problem that Crosley and others were attacking with AM and RF power. He realized that the noise and static that interfered with AM broadcasting were disturbances of amplitude. Armstrong theorized that if an RF carrier were modulated by varying its frequency, and the receiver limited all variations of the received signal's amplitude, a broadcasting system with much lower noise levels and high immunity to static could be developed.
Armstrong did extensive developmental work, proving that FM radio transmissions were possible. Many theoreticians claimed to have proof that Armstrong's experiments were impossible, basing these refutations on mathematical models that predicted an infinite transmission bandwidth would be required.
Armstrong went on to prove that FM could be transmitted in a practical bandwidth by launching the first commercial FM broadcast stations. Among the advantages of FM are freedom from static, wide audio bandwidth, and the ability of an FM receiver to capture the stronger of two signals transmitted on the same carrier frequency.
Analog technology would rule the radio and television broadcast airwaves for the next 45 years or so, until digital technology was once again introduced into the modulation process. Meanwhile, amplitude modulation continued to evolve from purely analog techniques (including grid, ampliphase, Doherty, and high-level plate modulation) to quasi-digital pulse-duration modulation (PDM), and finally to direct digital RF synthesis of the AM envelope (DX).
Hilmer I. Swanson, Senior Scientist and Harris Fellow, was responsible for inventing the most significant new AM technologies, including Harris' PDM and DX transmitters for broadcasting. These digital-modulation techniques offered much higher operating efficiency. In addition, they provided the modulation bandwidth and accuracy that were necessary to create the RF waveforms required for new digital broadcast systems.
Over-the-air, terrestrial digital radio and television broadcast still requires digital data, which represents the program content, to be converted into an analog format. That format must fit into a bandwidth-limited slice of RF spectrum. This transformation from digital data into an RF waveform representing that data is implemented by an RF modulation-demodulation (modem) process. The digital modulator converts data symbols into a vector-modulated RF waveform. That waveform can pass through the analog RF path, including power amplifiers (PAs) and filters, within the channel bandwidth.
Although the transmitted RF waveform is analog in nature, the modulation process has moved completely into the digital domain. This transformation has made it possible to design all-digital, software-defined-radio (SDR) systems that can be instantly changed from one broadcast transmission standard to another.
Digital Broadcast Support
A number of infrastructure changes are required to support digital broadcasting. One example involves peak-to-average power ratio (PAR), or "crest factor." Digital TV signals have a high PAR compared to a constant-envelope signal. Digital modulation waveforms, including orthogonal-frequency-division-multiplexing (OFDM) and eight-level vestigial-sideband (8VSB) formats, contain complex, simultaneous AM and PM modulation that requires linear RF amplification.
Such linear amplification is required for digital waveforms that have envelope variations. Essentially, it is needed to meet the RF emission mask and limit amplitude and phase errors that would degrade the modulation accuracy. Linear amplification also minimizes in-band RF intermodulation-distortion (IMD) products, which degrade the digital signal-to-noise ratio.
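To see why multicarrier waveforms stress amplifier linearity, the short sketch below (a hypothetical, pure-Python illustration, not broadcast-grade DSP) sums equal-amplitude carriers with random phases and computes the resulting PAR. A single carrier has a 0-dB crest factor, while dozens of carriers routinely exceed 8 dB.

```python
import cmath
import math
import random

def par_db(num_carriers: int, samples_per_symbol: int = 1024, seed: int = 1) -> float:
    """Peak-to-average power ratio (dB) of a sum of equal-amplitude
    carriers with random phases -- a crude stand-in for one OFDM symbol."""
    rng = random.Random(seed)
    phases = [rng.uniform(0, 2 * math.pi) for _ in range(num_carriers)]
    powers = []
    for n in range(samples_per_symbol):
        t = n / samples_per_symbol
        # Complex baseband sum of the carriers (an inverse DFT, in effect)
        s = sum(cmath.exp(1j * (2 * math.pi * k * t + phases[k]))
                for k in range(num_carriers))
        powers.append(abs(s) ** 2)
    peak, avg = max(powers), sum(powers) / len(powers)
    return 10 * math.log10(peak / avg)

print(f"1 carrier  : {par_db(1):5.2f} dB")   # constant envelope -> 0 dB
print(f"64 carriers: {par_db(64):5.2f} dB")  # high crest factor
```

The amplifier must pass those rare envelope peaks without clipping, which is exactly why average efficiency suffers.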
Digital pre-correction is another significant development for the digital broadcaster. One example is real-time adaptive correction (RTAC), which Harris developed to pre-correct the linear amplifier for AM/AM and AM/PM nonlinearities. A tradeoff exists between the level of in-band RF IMD products that can be tolerated and how far into saturation the amplifier can be driven on envelope peaks. Effective use of RTAC raises the linear amplifier's efficiency by running it closer to saturation while still meeting the required digital signal-to-noise ratio and RF spectrum emission mask.
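RTAC itself is proprietary, but the idea behind AM/AM pre-correction can be sketched generically: invert the amplifier's measured gain curve so that the cascade of predistorter and amplifier is linear. The toy `pa_gain` model and lookup table below are illustrative assumptions, not Harris's algorithm; real correctors also correct phase (AM/PM) and adapt continuously from a feedback sample of the output.

```python
import math

def pa_gain(v_in: float) -> float:
    """Toy amplifier AM/AM curve: soft compression toward saturation."""
    return math.tanh(v_in)

def build_predistorter(steps: int = 1000):
    """Invert the AM/AM curve by table lookup: for each desired output
    amplitude, find the input drive that produces it."""
    table = []
    v_max_out = pa_gain(3.0)  # usable output range of the toy PA
    for i in range(steps + 1):
        target = v_max_out * i / steps
        # binary-search the monotone gain curve for the required drive
        lo, hi = 0.0, 3.0
        for _ in range(60):
            mid = (lo + hi) / 2
            if pa_gain(mid) < target:
                lo = mid
            else:
                hi = mid
        table.append((target, (lo + hi) / 2))
    return table

def predistort(v: float, table) -> float:
    """Nearest-entry lookup (real correctors interpolate and adapt)."""
    best = min(table, key=lambda e: abs(e[0] - v))
    return best[1]

table = build_predistorter()
# Cascade: predistorter followed by the PA should be nearly linear.
for v in (0.2, 0.5, 0.8):
    print(v, round(pa_gain(predistort(v, table)), 3))
```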
New technology also makes it possible to modulate the drain voltages of high-power LDMOS field-effect transistors (FETs) so that the supply tracks the peak envelope power, keeping the amplifier operating near its highest efficiency over a significant portion of the envelope depth. The Doherty technique, developed by William Doherty in 1936, offers another efficiency enhancement. It combines two or more amplifiers so that one device stays near saturation, and thus near peak efficiency, over the upper portion of the envelope, while the other device(s) supply the additional power needed on peaks. Two-way, three-way, and N-way Doherty combined amplifiers have been investigated for OFDM and 8VSB applications.
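The efficiency benefit can be quantified with the textbook idealization of a two-way Doherty (ideal devices, lossless combining). The formulas below are the standard ideal-model results, not measurements of any product: efficiency reaches the class-B peak value (pi/4, about 78.5%) both at full power and at 6 dB of backoff, instead of only at full power.

```python
import math

def class_b_eff(v: float) -> float:
    """Ideal class-B efficiency vs. normalized output voltage v in [0, 1]."""
    return (math.pi / 4) * v

def doherty_eff(v: float) -> float:
    """Ideal two-way Doherty efficiency (textbook model): the carrier
    amplifier saturates at v = 0.5 and stays near peak efficiency while
    the peaking amplifier supplies the rest of the envelope."""
    if v <= 0.5:
        return (math.pi / 2) * v                     # carrier amp alone, doubled load
    return (math.pi / 2) * v * v / (3 * v - 1)       # both amps active

# The ideal Doherty hits class-B peak efficiency (pi/4) at both
# v = 0.5 (6-dB backoff) and v = 1.0 (full power).
for v in (0.25, 0.5, 1.0):
    print(f"v={v:4}: class B {class_b_eff(v):.3f}, Doherty {doherty_eff(v):.3f}")
```

Because high-PAR signals spend most of their time well below peak, this backed-off efficiency is what dominates the average.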
Analog To Digital Radio Transition
Generally, broadcast radio is considered the "last" analog medium. The switch from film to digital cameras, VHS to DVD, and NTSC analog television to ATSC DTV/HDTV all represent analog-to-digital transitions that are generally understood by the consumer.
Digital radio broadcasting (DRB), the generic term for all digital radio systems, has been in development since the early 1980s. Development began in Europe with the purpose of finding a more efficient digital replacement for traditional analog FM broadcasting.
Digital radio systems can provide multiple audio and data services on a single channel as opposed to the single-program-per-channel model used in traditional analog AM and FM radio transmission (i.e., one service per frequency). All DRB systems use some form of OFDM modulation of multiple carriers to produce the digital transmission. They also can subdivide the available digital bandwidth into separate functional data streams. DRB standards are now relatively mature. Yet they continue to evolve with greater capacity and ever-richer applications for media content.
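The common thread of these systems, OFDM, reduces to mapping data symbols onto orthogonal subcarriers with an inverse DFT. The sketch below is a minimal, hypothetical illustration (QPSK mapping, no cyclic prefix, pilots, interleaving, or channel coding), not any particular DRB standard:

```python
import cmath
import math

def ofdm_symbol(bits, n_fft=16):
    """Map bit pairs to QPSK points, load them onto subcarriers, and
    take an inverse DFT -- the essence of every OFDM modulator."""
    qpsk = {(0, 0): 1 + 1j, (0, 1): -1 + 1j, (1, 1): -1 - 1j, (1, 0): 1 - 1j}
    symbols = [qpsk[(bits[i], bits[i + 1])] / math.sqrt(2)
               for i in range(0, len(bits), 2)]
    carriers = symbols + [0j] * (n_fft - len(symbols))  # unused carriers off
    # Inverse DFT: each data symbol modulates one orthogonal subcarrier.
    return [sum(carriers[k] * cmath.exp(2j * math.pi * k * n / n_fft)
                for k in range(n_fft)) / n_fft
            for n in range(n_fft)]

time_samples = ofdm_symbol([0, 0, 1, 0, 1, 1, 0, 1])
print(len(time_samples))  # one OFDM symbol of 16 complex samples
```

Because the subcarriers are orthogonal over the symbol period, the receiver can separate them with a forward DFT even though they overlap in spectrum.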
Eureka 147, the first DRB system, grew out of a European Union project in 1987. Eureka 147-based Digital Audio Broadcast (DAB) and, more recently, DAB+, Digital Radio Mondiale (DRM), in-band on-channel (IBOC) HD Radio, and Digital Multimedia Broadcast (DMB) systems have all been developed to replace traditional analog VHF FM radio.
DRM and IBOC systems also have been developed for medium-wave AM broadcast. DAB could be transmitted on the VHF FM band (87 to 108 MHz), but the services that have been introduced in Europe, Canada, and Australia are using Band III (174 to 230 MHz) instead, which was formerly used for analog television signals. Other countries, such as Germany and Canada, use the L-band (1452 to 1492 MHz).
DAB receivers on the market can receive both Band III and L-band transmissions. Each DAB multiplex occupies approximately 1.5 MHz bandwidth to provide around 1 Mb/s data capacity. A goal of industry and these governments is to see all radio broadcasting moved out of the existing FM band, which may then be auctioned to other potential users of the radio spectrum. Examples include broadband and wireless telecommunications service providers. Recently, a variant of DAB+ implemented the more efficient MPEG-4 HE-AAC v2 codec, allowing the system to carry more channels or have better sound quality.
HD Radio Broadcasting
The desire for a more gradual digital transition in the US and other countries, along with the unavailability of alternate frequency spectrum, has led to the development of in-band on-channel (IBOC) digital radio technology. IBOC allows stations to transmit digital audio and data services within their assigned channel in their existing Band II allocations. HD Radio is the trademark for a hybrid IBOC technology developed by iBiquity Digital Corp. It was selected by the FCC in 2002 as the digital radio standard for the US.
HD Radio refers to the method of transmitting an IBOC DRB signal centered on, and to either side of, the AM or FM station's current analog broadcast signal. The digital emissions fall within the authorized spectral emission mask of the existing AM or FM channel. HD Radio broadcasters can keep their current dial positions and take advantage of existing towers and transmitter sites.
A primary digital audio channel duplicates the analog program. In doing so, it allows compatibility with traditional analog receivers and fallback protection to analog mode in the event of digital signal loss on the digital receivers. In the future, the analog signal may be turned off, allowing for the eventual transition to a higher-capacity, all-digital broadcast.
The FM HD Radio system occupies about 200 kHz, providing 96 to 128 kb/s throughput that can be subdivided into multiple audio and data services. The AM digital system has a data rate of about 36 kb/s with auxiliary services provided at a much lower data rate. Both the FM and AM schemes use sophisticated compression techniques to make the best use of the limited bandwidth, providing near-compact-disc (CD)-quality audio (from a 44.1-kHz, 16-b source) on FM and FM-like quality audio on AM.
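The compression this implies is simple arithmetic: uncompressed CD-quality PCM runs at 44.1 kHz x 16 b x 2 channels, so fitting it into even the low end of the FM HD Radio payload takes roughly 15:1 audio coding (the channel split is a hypothetical example):

```python
# Uncompressed CD-quality PCM: 44.1-kHz sample rate x 16 bits x 2 channels
cd_pcm_kbps = 44.1 * 16 * 2        # = 1411.2 kb/s
fm_hd_kbps = 96                    # low end of the FM HD Radio payload

print(f"PCM source rate: {cd_pcm_kbps:.1f} kb/s")
print(f"compression needed: {cd_pcm_kbps / fm_hd_kbps:.1f}:1")

# The payload can instead be subdivided, e.g. two services at 48 kb/s each,
# each requiring proportionally deeper compression.
```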
To an extent, Panama, Mexico, and Canada also have adopted HD Radio. It is being evaluated for adoption in more than a dozen countries including Brazil, Argentina, and China. HD Radio's backward-compatibility is attractive, as new digital radios pick up both HD Radio and analog signals along with improved data capabilities (Fig. 5).
Television Enters The Digital Age
June 2009 marked the end of the analog television era for the 1800+ full-power television stations in the US. There are still another 4500+ low-power, Class A and translator stations that must convert to digital over the next four years. Digital-TV conversion in other countries ranges from early planning to complete.
Many factors drove the television industry to convert from analog to digital. The quest for improvement of the television system in the US began with two prime drivers. The first was the demonstration of analog HDTV by the Japanese in the early 1980s. Although the Japanese system was analog, it required additional RF bandwidth to transmit. The second was pressure from the land-mobile industry to take RF spectrum from the broadcasters to support mobile communication services.
Both the broadcast industry and the government took an interest in HD technology. Congress directed the FCC to check into HD and the spectrum requirements. This action led to the FCC creating the Advisory Committee on Advanced Television in 1988. Former FCC Chairman Richard Wiley was chosen to lead the committee.
Without delving into the next seven years of politics and industrial intrigue, the Advisory Committee looked at a myriad of analog and hybrid solutions to fit HD into a 6-MHz RF channel. By 1993, General Instrument had proposed an all-digital system. Others soon followed. The proponents of the various digital systems had a chance to test their technology at an industry-created Advanced Television Test Center. The test-bed system was designed and built by Harris.
Eventually, Wiley persuaded the six proponents to join together in a Grand Alliance to bring the best of each system together to form an all-digital US system. Once built and tested, the system underwent standardization through the Advanced Television Systems Committee (ATSC), which had been created to document and standardize new technical developments in broadcast television.
The US all-digital television system that came from the Grand Alliance became known as the ATSC system. HD was the original driver for DTV. But the ATSC system was architected around a packetized transport stream that could support multiple channels of HD and SD content as well as file-based content delivery.
The system uses 8VSB modulation to fit the 19.4-Mb/s payload into a 6-MHz RF channel. The 8VSB system was developed by Zenith, with Harris conducting early over-the-air tests. MPEG-2 compression squeezes the more than 1.5 Gb/s of raw data required to support HD into the 19.4-Mb/s transport stream.
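The roughly 1.5-Gb/s figure likely corresponds to the SMPTE 292 HD-SDI studio interface, which carries the full 1080i raster, blanking included, at 10 b per sample. The arithmetic below (with those sampling assumptions noted in comments) reproduces that rate and the resulting MPEG-2 compression ratio:

```python
# SMPTE 292 HD-SDI carries the full 1080i raster, including blanking:
total_w, total_h, fps = 2200, 1125, 30   # active 1920x1080 plus blanking
bits_per_sample_pair = 20                # 10-bit luma + 10-bit shared chroma (4:2:2)

hdsdi_bps = total_w * total_h * fps * bits_per_sample_pair
print(f"HD-SDI rate: {hdsdi_bps / 1e9:.3f} Gb/s")          # 1.485 Gb/s
print(f"compression into 19.4-Mb/s ATSC: {hdsdi_bps / 19.4e6:.0f}:1")
```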
The transmission of the 8VSB signal set a new level for RF design technology. Creating the RF signal required new techniques that included high-speed digital-to-analog conversion and high-stability, low-phase-noise direct digital synthesis. Amplifying this complex signal from milliwatts to tens of kilowatts of power also presented new challenges in amplifier linearity, adaptive correction techniques, out-of-band-emission suppression, and more. During the late 1990s and early 2000s, Harris RF engineers were busy developing and inventing new methods for transmitter improvement. One of these was the aforementioned RTAC.
While US engineers were busy developing DTV for the North American market, the same process was repeating itself in Europe, China, Japan, and Brazil as these locations jumped on the digital bandwagon. It became apparent that software-defined modulation techniques were necessary to keep up with the many evolving digital-transmission standards. The power of SDR technology is such that Harris ultimately reduced its exciter platforms from nine down to one.
These advancements continue today, even after much of the world has adopted digital-television transmission. Second-generation systems are beginning to appear, many of which employ significant technology changes. Products developed around SDR technology can respond to these changes rather quickly without starting over on the hardware platform.
Mobile DTV Enters The Picture
When digital television was developed, its primary drivers were HD delivery, multichannel capability, and spectrum efficiency. The world population was just beginning to get hooked on cellular telephones and laptop personal computers (PCs). Devices like smartphones, personal digital assistants (PDAs), and tablet computers were not even known to the general public. Yet people soon desired to take their communications and entertainment with them as they became more mobile. As mobility developed around second-generation (2G), third-generation (3G), fourth-generation (4G), and WiFi connectivity, US broadcasters made it known that they wanted to participate in the mobile delivery of media content.
Wireless carriers have had mixed-to-poor results using their cellular technology to deliver media content. Large amounts of bandwidth are required to support real-time delivery, and cellular systems work on a one-to-one connection basis, making their use of spectrum very inefficient. We have all heard the stories of events where large crowds of people wanted to receive video on their wireless platforms and ended up bringing the systems down by oversubscribing individual cells.
Spectrum efficiency can be measured in many different ways. These measures must take into account more than just the number of bits transported per hertz of bandwidth consumed. Other factors, such as the total number of users served per bit transported and the percent of time that the spectrum is used (duty cycle), also must be considered in the overall spectrum efficiency of a given service.
The broadcast model supports one to an almost unlimited number of users without impacting the amount of bandwidth required to support the broadcast. Broadcast is the most spectrum-efficient use model available.
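The asymmetry can be made concrete with a toy calculation (the capacity and audience numbers below are hypothetical illustrations, not measurements of any deployed system):

```python
def unicast_streams(channel_capacity_mbps: float, stream_mbps: float) -> int:
    """How many simultaneous viewers a one-to-one (cellular-style)
    channel can support before it saturates."""
    return int(channel_capacity_mbps // stream_mbps)

# A broadcast channel serves every receiver in range with the same bits,
# so its per-viewer spectrum cost falls as the audience grows.
viewers = 100_000
broadcast_mhz_per_viewer = 6.0 / viewers       # one 6-MHz TV channel

unicast_cap = unicast_streams(20.0, 2.0)       # hypothetical 20-Mb/s cell, 2-Mb/s video
print(f"unicast cell saturates at {unicast_cap} viewers")
print(f"broadcast cost: {broadcast_mhz_per_viewer:.2e} MHz per viewer")
```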
There were several proposals in the US about the technology needed to add mobility to ATSC broadcast television. Ultimately, tests and trials led broadcasters to select a system developed by LG Electronics, Zenith Labs, and Harris as the physical-layer basis to add mobility to the ATSC DTV system. The ATSC system is a single-carrier, 8VSB modulation-based system that is easily disturbed by Doppler, burst noise, and rapidly changing signal reflections.
The ATSC Mobile DTV (MDTV) system utilizes Internet Protocol (IP) as the transport medium and advanced video codec technology for low-bit-rate video coding. The mobile IP payload is encapsulated into the MPEG-2 transport used in the main ATSC broadcast system. It is then time-sliced into the main transport stream. The signal's mobile portion is wrapped in additional coding and error correction. It is provided with training signals, which enable the demodulator chip to more easily demodulate the signal's mobile portion.
The system borrows bits from the main ATSC transport stream in 917-kb/s increments. System efficiency, measured as payload delivered to the receiver, varies from 37% down to 17% depending on the level of coding applied to the mobile portion. At the same time, the system's threshold of visibility (TOV) improves from standard ATSC's 14.7-dB carrier-to-noise (C/N) ratio to levels as low as 3.5 dB. This system gain helps to overcome antenna limitations at the mobile receiver.
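A few lines of arithmetic, using only the figures quoted above, tie these numbers together:

```python
increment_kbps = 917          # bits borrowed from the main ATSC stream per increment
for efficiency in (0.37, 0.17):
    payload = increment_kbps * efficiency
    print(f"{efficiency:.0%} efficiency -> {payload:.0f} kb/s of mobile payload")

# Heavier coding trades payload for robustness: the C/N threshold drops
# from 14.7 dB (standard ATSC) to as low as 3.5 dB.
gain_db = 14.7 - 3.5
print(f"system gain: {gain_db:.1f} dB")
```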
ATSC MDTV is just one of several standards that are used around the world for mobile DTV. Other standards in Japan, Brazil, China, and Europe are either in use or under development. Although the future is still being played out, it is certain that mobile DTV will play an important role in providing emergency alerting and information during disasters. Time and time again, broadcast infrastructure has survived when the cellular infrastructure has failed during a disaster event. The recent earthquakes, hurricanes, and tornadoes in the US and around the world all have shown how broadcast came through when other systems failed.
Commercial radio broadcasting has been around for 91 years and commercial television is going on 65 years. Before television hits its 70th year, all TV transmissions in the US will be digital. In addition, a large percentage of the transmissions in the rest of the world will have transitioned to digital.
Radio is another story. In its many forms, digital radio will continue to grow in the number of deployments. Yet several factors will perpetuate basic analog AM and FM radio for many years to come, because the transition to digital is not government-mandated: a large installed base of analog radio receivers exists around the world, and analog receivers are simple to build at very low cost.
Digital TV brings new and exciting features like HD, 3D, streaming, multichannel programming, mobility, and interactivity. In contrast, digital radio promises improved sound quality, multichannel program services, and data delivery with interactivity.
As a result, consumers have more choices and richer services while broadcasters can use their allocated spectrum in new and exciting ways. The wave of the future will be hybrid systems that combine the spectrum efficiency of broadcast with near-real-time, on-demand delivery of audio and video content. But that is a story for another day.
Geoff Mendenhall, VP Transmission Research and Technology, [email protected],
Jay Adrick, VP Broadcast Technology, [email protected], and
Tim Anderson, Manager, Strategic Radio Market/Product Development, [email protected]
Harris Broadcast Communications
1025 West NASA Blvd.
Melbourne, FL 32919-0001