PC-Driven Test Instruments Pick Up Speed

The trend toward more integrated test instruments has improved performance at the high end while spawning a market for low-cost, portable, and PC-driven instruments.

Bucking the trend of rising costs, clever test-and-measurement designers have been successfully exploring methods of reducing costs while still delivering quality instruments. For example, a new market for test instruments is being created by the separation of an instrument’s processing/control and measurement modules. With the accelerated performance of personal computers, it also has proven viable to create PC-driven instruments that interface through high-speed and common platform-interface technologies.

Fig. 1
1. In a package the size of a standard flash drive, some compact spectrum-analyzer manufacturers are enabling spectrum sensing to 8.15 GHz. (Courtesy of Triarchy)

Until recently, there were several technological barriers to using PC-driven techniques for high-performance test-and-measurement instruments. One such restriction was the inability to rapidly transfer data through a common interface platform from a sensor to a PC. Here, speeds had to be fast enough to provide adequate fidelity. Additionally, CPU processing speeds, physical memory, and random-access memory (RAM) were not up to the speed or capacity challenge of handling professional-grade instrument data output. These factors would lead to a large bottleneck as attempts were made to handle the data from the instruments.

In addition, limitations applied to the processing of data once it was transferred. Another challenge associated with portable platforms was providing enough power for both the test instrument and the PC hardware. In the mid-2000s, only a few components could operate within the power and data rates offered by common interfaces of the day, such as USB 2.0. Yet network analyzers (NAs) or spectrum analyzers (SAs) with enough bandwidth or frequency range to compete with even low-end benchtop instruments were still inconceivable.

Solving many of these limitations with the hardware of the time would have required a proprietary interface technology and a specialized workhorse of a computer, negating the benefits of a PC-driven approach. In 2008, the USB-IF released USB 3.0, an update to the USB standard that enabled raw data rates of 5.0 Gbps and 900 mA of simultaneous power draw. The PC world adopted USB 3.0 as the standard for the latest computers, laptops, and even several tablet computers. As for its impact on test and measurement equipment, Matt Maxwell, Tektronix’s product manager of spectrum analyzers, notes, “USB-based instruments are no longer limited in terms of performance or analysis capabilities, making them a viable option for much more demanding applications than in the past.”

The low costs and high data capabilities associated with USB have supported the emergence of a variety of different applications. Examples range from USB test instruments to high-definition/high-speed imaging cameras and industrial sensing/control. Maxwell says, “Until the advent of USB 3.0, it simply wasn’t possible to move real-time signal information to a laptop or tablet computer fast enough. With highly capable quad-core processors like the Intel Core i7—along with solid-state drives (SSDs) and fast RAM readily available at affordable price points—today’s USB 3.0-equipped PCs and tablets represent capable platforms for supporting test-and-measurement applications.”

Portable and low-power processing, memory, and storage have all advanced, thanks to the growing demand for portable computers and the Internet-of-Things (IoT) connected environment. Meanwhile, smaller, higher-performance solid-state memories and CMOS microelectronics have reduced both the size and power draw of computers. In fact, many of the latest laptops/tablets have nearly desktop-equivalent processing capability. The portability of heavy processing has enabled convenient on-site programming, processing, and data storage for large amounts of complex sensor data.

As computers became a standby in every household and office, the number of computer accessories and peripherals also grew. The size of these markets drove computer manufacturers to settle mostly on common interfacing, which has enabled PC-driven test and measurement instruments to advance more consistently. Maxwell says, “Test-and-measurement providers are experts at designing and building the actual signal-acquisition and analysis solution. PC manufacturers are experts at designing and building computer systems, and can do so more cost-effectively than test-and-measurement providers.”

Fig. 2
2. By offering modules for either USB or Ethernet-based connectivity, some PC-driven test and measurement manufacturers are taking a more modular approach to increase data rates for a variety of instruments. (Courtesy of Holzworth Instrumentation)

Consider commercial off-the-shelf (COTS) technology, which is increasingly being adopted by industrial and military organizations. In computers as well as other consumer products, modular components have decreased in cost and increased in performance at a pace that proprietary technologies could not match. “Therefore, it makes economic sense for test-and-measurement companies to take advantage of the advances enabled by the PC industry,” says Maxwell. “By removing the requirement to build a front-end PC, test-and-measurement companies can offer instrumentation at lower price points and bring new products to market faster.” (Fig. 1).

The benefits of PC-driven test-and-measurement instruments, such as cost, portability, and rich software potential, are attractive enough to encourage early technology innovators. Yet new techniques must be used to shrink the footprint of such high-performance electronics with a limited power budget and the necessary ruggedization. “USB instrumentation needs to be light, compact, and rugged,” explains Maxwell. “This typically requires all new board layouts and potentially new ASICs, along with close attention to heat management while ensuring a clean signal path. While USB 3.0 is fast, bandwidth limitations and power constraints can limit performance.”

Even with USB 3.0’s increased data rate, practical limitations prevent the interface from reaching the maximum 5-Gbps data rate. Practical data rates range to 3.2 Gbps (Fig. 2). Yet Maxwell says, “The bandwidth limitations can be overcome by using a faster interface than USB 3.0, such as USB 3.1, Thunderbolt, or even multiple connections.”

The Personal Touch

To ensure the adoption of PC-driven instrumentation, it is critical that the human interface allow an engineer or technician to quickly become familiar with the instrument controls (Fig. 3). Of course, software-based controls offer great potential in terms of adding more customizability, features, and performance than a physical interface. The user, however, ultimately decides how an interface must be designed. Maxwell emphasizes, “Advanced capabilities in software are vital to the success of USB instrumentation. Without a front-panel display, software for controlling the instrument must be fast, intuitive, and easy to use. Usability studies show that productivity is similar across instruments that include hardware controls and software-only based instruments.”

Fig. 3
3. Because software-based display and control are key to PC-driven instruments, some companies are releasing software-development kits that allow for user-based programs to manipulate the data or instrument functions. (Courtesy of Pico Technology)

An added benefit of this class of instruments is that the software behind the instrument display is often open to manipulation, and even outright redesign, by the user. Some manufacturers of PC-driven instruments provide access to software-development kits (SDKs), which enable users to completely manage the processing and display of the data fed by the instruments. For large-scale manufacturing environments, this flexibility could be used to develop rapid or even automated testing processes that demand much less development investment than benchtop instruments.
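For instance, user code fed raw IQ samples by such an SDK could measure tone power directly. The sketch below is hypothetical: it synthesizes samples in place of a real SDK acquisition call, and the 40-MHz sample rate and 5-MHz test tone are illustrative assumptions, not values from any vendor’s API.

```python
import cmath
import math

def tone_power_dbfs(iq, tone_hz, sample_rate_hz):
    """Single-bin DFT (Goertzel-style) of complex IQ samples, in dBFS."""
    n = len(iq)
    acc = sum(s * cmath.exp(-2j * math.pi * tone_hz * k / sample_rate_hz)
              for k, s in enumerate(iq))
    return 20 * math.log10(abs(acc) / n + 1e-12)

# Synthetic stand-in for SDK-delivered samples: a -20-dBFS tone at 5 MHz
fs = 40e6
iq = [0.1 * cmath.exp(2j * math.pi * 5e6 * k / fs) for k in range(4096)]
print(f"{tone_power_dbfs(iq, 5e6, fs):.1f} dBFS")  # -20.0 dBFS
```

In a real automated-test flow, the synthesized list would be replaced by whatever buffer-read call the vendor’s SDK exposes, with the measurement logic unchanged.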

Examples of USB 3.0-connected instruments include new portable and real-time SAs from Signal Hound and Tektronix. Signal Hound’s BB60C SA streams 140 MB/s of digitized RF spectrum from 9 kHz to 6 GHz, enabling an instantaneous bandwidth of 27 MHz at sweep speeds of 24 GHz/s. The Tektronix RSA306 SA captures signals from 9 kHz to 6.2 GHz with 40 MHz of real-time bandwidth down to -160 dBm. Both instruments come with a built-in software package and application programming interface (API) for customizable software and interfacing. The RSA306 also includes a Matlab driver for the API, which can readily connect with Matlab’s Instrument Control Toolbox.
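As a back-of-the-envelope check, the BB60C’s quoted 140-MB/s stream fits comfortably within USB 3.0’s practical throughput of roughly 3.2 Gbps; the arithmetic below uses only figures quoted in this article:

```python
# Headroom check: instrument stream rate vs. practical USB 3.0 throughput.
stream_mb_per_s = 140          # BB60C digitized-RF stream rate
practical_gbps = 3.2           # practical USB 3.0 figure cited in the text

stream_gbps = stream_mb_per_s * 8 / 1000.0   # MB/s -> Gbps (decimal units)
headroom = practical_gbps / stream_gbps
print(f"Stream uses {stream_gbps:.2f} Gbps; headroom factor ~ {headroom:.1f}x")
```

The stream occupies about 1.12 Gbps, leaving nearly 3× headroom for wider-bandwidth instruments on the same interface.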

Both of these devices are recommended for use with PCs that have an Intel Core i7 processor and 8 GB or more of RAM, which are needed to process the data. Because a main design consideration for these devices is lowering end-user costs, USB 3.0 is the sole interface. Yet the addition of a compact computer with a network interface can transform these instruments into full-functioning portable test instruments with remotely programmable and controllable capabilities. In addition, Gigabit USB device servers enable 10/100/1000-Mb/s switched Ethernet connectivity, with higher data rates eyed for the future. They also may enable remote data and power connections to multiple USB devices.

Other wired connectivity options may soon become so common that it makes sense to use them as instrumentation interfaces. The most likely to impact PC-driven test equipment is USB 3.1, which increases USB 3 data rates to 10 Gbps and reduces the line-encoding overhead to 3%. That reduction in overhead results from the change to the 128b/132b encoding scheme. Although the data rate is crucial, the new USB 3.1 standard also will allow for multiple power levels: 10 W with 2 A at 5 V, 60 W with 5 A at 12 V, and 100 W with 5 A at 20 V.
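The overhead reduction follows directly from the two encoding ratios named above:

```python
# Line-encoding overhead: fraction of line bits that carry no payload.
def encoding_overhead_pct(payload_bits, line_bits):
    return 100.0 * (1.0 - payload_bits / line_bits)

usb30_overhead = encoding_overhead_pct(8, 10)     # USB 3.0's 8b/10b
usb31_overhead = encoding_overhead_pct(128, 132)  # USB 3.1's 128b/132b
print(f"8b/10b: {usb30_overhead:.0f}%, 128b/132b: {usb31_overhead:.1f}%")
```

The move from 8b/10b to 128b/132b cuts encoding overhead from 20% to about 3%, so nearly all of the 10-Gbps raw rate is available for payload.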

This 2× to 20× increase in power would enable much more power-hungry and higher-speed digital-to-analog converters (DACs) and analog-to-digital converters (ADCs). It also would pave the way for better-performing low-noise amplifiers (LNAs) for greater sensitivity and even fully PC-driven and PC-powered radio peripherals. With up to 100 W and 10-Gbps connectivity on a single interface, a cost-effective multifunction instrument could possibly tear down the boundaries between separate test-and-measurement devices. (The USB 3.1 interface also flaunts a future-proof design, as it was designed with consideration of future speeds greater than 20 Gbps, according to USB-IF.)

Two main bottlenecks remain for PC-driven instrumentation: data rates and the adoption of interfaces that can transfer the data needed to enable high-performance instruments. Another path may be to move toward wireless interface standards, as the consumer market is demanding ever-higher bandwidths from wireless connections. Indeed, there have been significant increases in data rates and bit-error-rate (BER) improvements with wireless interface technologies. For example, Wi-Fi 60 GHz (IEEE 802.11ad) boasts speeds to 7 Gbps. But interference considerations may limit the adoption of these types of wireless technologies.

Fig. 4
4. Several non-transmitting instruments are both controlled and powered through a PC-driven connection, such as USB. (Courtesy of Vaunix)

Visible light communication, such as Li-Fi, is another emerging technology that could be free of electromagnetic interference. This technology uses visible light-emitting diodes (LEDs) to transmit and receive data. Current experimental rates range to 10 Gbps. Much faster speeds are theorized, due to the huge bandwidth availability and low interference in visible bands.

Buoyed by new developments, PC-driven test-and-measurement instruments will continue to overcome any doubt and hesitancy in the engineering community. Already, most major test-and-measurement companies have embraced PC-driven architectures for low- and mid-range network analyzers, spectrum analyzers, and other components (Fig. 4). “Based on these advantages, we’re confident that the trend toward professional-grade USB instrumentation will continue and have a significant impact on the market,” says Maxwell. “As faster interfaces, such as USB 3.1, become mainstream, look for more capable and higher-performance instruments to be introduced. Meanwhile, demand for remotely connected/controlled devices is increasing in the industrial and commercial environments, which may lead to more devices being equipped with either wireless or wired Ethernet capability.

“Ethernet connectivity may also be a possibility if there is shown to be sufficient need and demand,” Maxwell continues. “To be sure, the future is indeed bright for USB/Ethernet instrumentation.”
