
Wireless Standards Ensure Connectivity

March 30, 2016
Wireless standards have steadily evolved toward a worldwide wireless network—one that is capable of handling the massive amounts of data generated by modern users.

Wireless communications have become so pervasive in daily life that most people take for granted the effective operation of multiple wireless devices each day (including their cellular telephones and Internet-connected computers). But bandwidth is limited, and without wireless standards to set limits and guidelines, all of these different wireless devices would compete for their “wireless space,” and only the strongest signals would make the connection.

Fortunately, with the aid of regulatory organizations such as the Federal Communications Commission (FCC), frequencies and bandwidth are managed for effective co-existence of many different wireless communications devices and functions. By organizing frequencies into licensed and unlicensed bands and using different strategies for signal transmissions, order is maintained.


In theory, even devices with extremely low signal levels will function in the presence of much larger signals from broadcast stations if wireless standards are carefully designed and implemented. As the number of wireless devices and functions increases, they must fit comfortably within relatively limited bandwidth. This requires the use of radio transmission methods that make the most efficient use of the available bandwidth.

Consider how commercial frequency-modulated (FM) radio is designed, with channels assigned across the 88-to-108-MHz broadcast band: the scheme is relatively inefficient in its number of “users” (channels) compared to the number of modern cellular telephone users served at higher frequencies. Early cellular communications systems were based on analog Advanced Mobile Phone System (AMPS) technology, which was also applied to shorter-range products such as cordless telephones.

As the popularity of the communications format became apparent, more efficient cellular standards were needed for a rapidly growing number of users. The result has been a fast sequencing of new generations of cellular communications formats based on digital modulation techniques, allowing for more efficient use of available bandwidth and more users per cell site.

There has been a rapid succession of new cellular standards, extending from First-Generation (1G) analog technology, through Second-Generation (2G), Third-Generation (3G), and the current Fourth-Generation (4G) cellular standards, including Long Term Evolution (LTE) and Long Term Evolution-Advanced (LTE-A).

As mobile telephones themselves evolve into all-in-one tools for voice, data, and video communications over the Internet, service providers must scramble to enhance their wireless networks with increased capacity and performance. That pressure is now driving cellular service providers and component/system designers to debate the performance requirements for a Fifth-Generation (5G) global cellular communications standard (see sidebar).

Each cellular communications generation has included its own set of wireless standards, which isn’t always compatible with the previous generation’s. 2G cellular systems included the specifications detailed in the Global System for Mobile Communications (GSM) standards (originally developed by ETSI and now maintained by the Third Generation Partnership Project, or 3GPP) and the first generation of the novel code-division-multiple-access (CDMA) networking technology developed by Qualcomm, known as cdmaOne.

Typically, an advance to the next cellular communications generation has involved a transition; the “2.5G” transition employed wireless networks based on the next generation of CDMA, cdma2000. Along with the Universal Mobile Telecommunications System (UMTS), based on the earlier GSM standard, cdma2000 served as a wireless network standard for the growth of 3G cellular services.

Both would also help bolster the growth of cellular services through the 3.5G transition to current 4G systems, along with such other wireless standards as mobile WiMAX (IEEE 802.16e), EVDO-B, HSPA, HSPA+, and LTE. Critics of the original LTE standard pushed for the development of LTE-A as a “true 4G cellular standard” during the adoption of 4G cellular services.

In essence, all of these wireless standards have dealt with the same sets of tradeoffs: working to provide as much performance as possible within limited bandwidths, and for the lowest cost possible. The standards have differed in terms of multiplexing schemes, modulation, and waveform types, making use of the Internet Protocol (IP) in later generations of cellular communications standards for the fastest data rates possible.

Much infrastructure tuning has been necessary along the way. Picocells and microcells, for example, have been added inside buildings and other areas where coverage from a cell’s main base station might have been blocked, weak, or nonexistent.

The evolution of cellular communications standards has occurred alongside that of their closer-range wireless companions, the wireless-local-area-network (WLAN) standards. The most notable of these are the IEEE 802.11 standards established by the Institute of Electrical and Electronics Engineers (IEEE). As the development of the IEEE 802.11 Wi-Fi standards makes apparent, and as was well documented in a 2015 Microwaves & RF article on wireless standards, the trend in all wireless communications standards is to achieve higher data rates without increasing the consumption of bandwidth.

Perhaps the simplest way to think of different wireless communications standards is by how far they allow a communications link to effectively operate (at some established data rate) between a sender and receiver. Cellular radio standards support voice, video, and data communications around the world, while WLAN standards support communications in an area that may be as small as an office or a single room.

As formulated by the IEEE, the earliest version of IEEE 802.11—for WLANs in the unlicensed 2.4-GHz band—was expanded to two different-frequency versions in 1999: IEEE 802.11a at 5 GHz and 802.11b at 2.4 GHz. The difference in frequency bands and modulation methods (OFDM for 802.11a, DSSS/CCK for 802.11b) translated into differences in maximum data rate: 54 Mb/s for IEEE 802.11a and 11 Mb/s for IEEE 802.11b.
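To see where the 54-Mb/s figure comes from, the brief Python sketch below tallies the bits carried by a single 802.11a OFDM symbol (48 data subcarriers, 64-QAM, rate-3/4 coding, 4-µs symbol duration). It is only a back-of-envelope check using the standard's published physical-layer parameters, not a substitute for the specification itself.

    # Back-of-envelope check of the 54-Mb/s peak rate of IEEE 802.11a
    data_subcarriers = 48       # OFDM data subcarriers (pilots excluded)
    bits_per_subcarrier = 6     # 64-QAM
    coding_rate = 3 / 4         # highest convolutional coding rate
    symbol_duration_s = 4e-6    # 3.2-us symbol plus 0.8-us guard interval
    bits_per_ofdm_symbol = data_subcarriers * bits_per_subcarrier * coding_rate
    peak_rate_bps = bits_per_ofdm_symbol / symbol_duration_s
    print(f"802.11a peak PHY rate: {peak_rate_bps / 1e6:.0f} Mb/s")  # prints 54 Mb/s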

For cellular or wireless network communications, no one standard has ever been enough, especially considering the different frequency assignments in different countries and the many varied requirements for different wireless applications. The development of each standard is truly an evolutionary process. The current dominant 4G standard, LTE, is based on GSM/EDGE and UMTS/HSPA wireless network technologies which were developed by the Third Generation Partnership Project (3GPP).

To improve upon the initial LTE standard and meet the ITU-R's IMT-Advanced requirements, the 3GPP developed the higher-performance LTE-A version for 4G cellular systems. The goal of LTE, which is a trademark owned by the European Telecommunications Standards Institute (ETSI), was to improve upon 3G cellular standards with increased capacity and data transmission speeds. It uses digital signal processing (DSP), advanced modulation, and a packet-switched, IP-based approach to reduce the data latency that plagued 3G systems.

LTE, which is incompatible with the earlier cellular standards, supports both frequency-division-duplex (FDD) and time-division-duplex (TDD) operation, as did UMTS before it: transmitting and receiving either on different frequencies simultaneously or on the same frequency at different times. It achieves peak uplink rates to 75 Mb/s and peak downlink rates to 300 Mb/s, with data-transfer latency of less than 5 ms. It operates on scalable bandwidths from 1.4 to 20.0 MHz in licensed frequency bands.
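The 300-Mb/s downlink figure can be approximated with similar arithmetic. The Python sketch below assumes a 20-MHz carrier (100 resource blocks), 64-QAM, four spatial layers, and a roughly 25% allowance for control and reference-signal overhead; the overhead value is an assumption for illustration, not a figure taken from the LTE specifications.

    # Rough estimate of LTE's peak downlink rate on a 20-MHz carrier
    resource_blocks = 100              # 20-MHz LTE carrier
    subcarriers = resource_blocks * 12
    ofdm_symbols_per_second = 14_000   # 14 symbols per 1-ms subframe (normal cyclic prefix)
    bits_per_symbol = 6                # 64-QAM
    mimo_layers = 4                    # 4x4 spatial multiplexing
    overhead_fraction = 0.25           # control, reference signals, coding (assumed)
    raw_bps = subcarriers * ofdm_symbols_per_second * bits_per_symbol * mimo_layers
    usable_bps = raw_bps * (1 - overhead_fraction)
    print(f"Raw: {raw_bps / 1e6:.0f} Mb/s, usable: ~{usable_bps / 1e6:.0f} Mb/s")  # ~403 and ~302 Mb/s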

Standard LTE comes in several versions, including LTE-TDD (which uses a single frequency, downloading and uploading data at different times) and LTE-FDD (which uses pairs of frequencies for uploading and downloading data). In addition, LTE Direct is a device-to-device wireless technology developed by Qualcomm for use in many countries, including the U.S., Japan, Italy, China, Sweden, and Australia.

Ultimately, the public’s appetite for wireless services has made clear the need for 5G technology and a new set of wireless standards—perhaps even looking to millimeter-wave frequencies for available bandwidth.

What Will It Take for 5G?

To be economically feasible, a new cellular standard such as 5G must offer much greater capacity and much higher peak data rates than 4G, in order to make the investment in a new wireless standard (and its infrastructure equipment) worthwhile for service providers. The 5G focus will be on providing more broadband services, including video streaming, on portable wireless devices. To accomplish the improvements in performance, network designs may shrink cell sizes to create denser cellular networks than those of earlier generations.

Various market forecasts call for a build-out of 5G cellular networks from 2017 through 2025. For example, the ITU-R has announced plans to release its IMT-2020 5G standard sometime around 2020. During this time, 5G will have moved far beyond the functionality of earlier cellular networks, incorporating the Internet and cloud-based data storage as essential parts of the network, along with Internet-of-Things (IoT) networking and both human-to-machine and machine-to-machine (M2M) communications.

It is clear that 5G cellular networks will rely heavily upon cognitive radio technology, such as software-defined radios (SDRs), to employ communications devices that can shift function and performance according to data rates and operating conditions. These networks may use duplex systems that vary the use of frequency or time, rather than just FDD or TDD, and they will no doubt use multiple antennas in multiple-input, multiple-output (MIMO) configurations to improve distance, coverage, and data rates. The antennas themselves will likely be advanced “smart” antennas capable of altering beam directions electronically for optimum reception.
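An idealized Shannon-capacity calculation shows why spatial multiplexing is so attractive: with a full-rank channel and equal signal-to-noise ratio (SNR) on each layer, capacity scales roughly with the number of antenna pairs. The Python sketch below uses illustrative numbers (20-MHz bandwidth, 20-dB SNR), not values from any particular standard.

    import math
    # Idealized MIMO capacity: layers * bandwidth * log2(1 + SNR)
    bandwidth_hz = 20e6
    snr_db = 20
    snr_linear = 10 ** (snr_db / 10)
    for layers in (1, 2, 4):
        capacity_bps = layers * bandwidth_hz * math.log2(1 + snr_linear)
        print(f"{layers}x{layers} MIMO: ~{capacity_bps / 1e6:.0f} Mb/s")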

Finally, 5G cellular standards will most likely need to reach beyond the traditional cellular frequency range of about 800 to 3,800 MHz for transmission and reception, and carry such links as backhaul connections within the network over millimeter-wave frequencies (around 50 GHz). With data traffic growing through wireless services and additional applications such as IoT devices, bandwidth is invaluable; at least one challenge for 5G systems will be making millimeter-wave links more practical.
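Part of that challenge is simple physics: free-space path loss grows with the square of frequency, so a millimeter-wave link gives up tens of decibels relative to a sub-6-GHz link over the same distance and must recover them with antenna gain or shorter hops. The Python sketch below applies the standard Friis free-space formula at a few illustrative frequencies and an assumed 200-m hop; the specific numbers are for illustration only.

    import math
    # Friis free-space path loss: FSPL(dB) = 20*log10(4*pi*d*f/c)
    def fspl_db(freq_hz, dist_m):
        c = 3e8  # speed of light, m/s
        return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)
    for f_ghz in (2.4, 28.0, 50.0):
        print(f"{f_ghz:>4.1f} GHz over 200 m: {fspl_db(f_ghz * 1e9, 200.0):.1f} dB")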


