It’s not just happening with consumer technology: Even high-end test and measurement instruments are beginning to fuse into Frankenstein-like combinations of classical instruments. Just as Apple’s new Apple Watch is an amalgamation of a timepiece, speakerphone, fitness hub, multi-pass, GPS, MP3 player, and possibly your next girlfriend, traditional RF/microwave instruments are becoming smaller, more portable, and more feature-rich.
Over-the-air upgrades are also trending, as are instruments that ditch the box entirely and embrace PC-driven connectivity. Recently, several test-and-measurement companies have released box instruments—specifically oscilloscopes—with built-in spectrum-analyzer (SA) functions, or even SA modules incorporated into the box. There are also fusions of network analyzers (NAs) and SAs in individual modular instruments with software-defined functions that enable full PC-driven transceiver tests.
Interestingly, this instrument melding is creating many new test opportunities to reduce test time, cost, and footprint. A major enabling factor is the availability of faster, more compact components that can digitize RF/microwave signals—custom-designed application-specific integrated circuits (ASICs), and even commercially available analog-to-digital (A/D) and digital-to-analog (D/A) converters.
One significant challenge these compound instruments may be able to meet is the extremely complex testing of high-bandwidth wireless standards and emerging electronic-warfare (EW) applications. The bandwidth requirements for these applications are extremely high. Moreover, the latest wireless standards incorporate features—such as carrier aggregation, MIMO, digital predistortion, and envelope tracking—that traditional spectrum-analyzer technology can’t keep up with (especially at millimeter-wave frequencies).
With 5G on the horizon—possibly reaching all the way up into the 60-GHz spectrum—there is a race for SAs with the bandwidth and performance needed to handle these highly complex modulation schemes. Oscilloscopes with sophisticated FFT functions could bridge the gap, especially as processing algorithms advance to recreate frequency-domain data more accurately. These solutions are more likely to appear in a modular or PC-driven architecture, as the computational capabilities of box instruments likely won’t keep pace with the growth of signal complexity and the demand for ever greater computing power.
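To make the FFT-bridging idea concrete, here is a minimal Python/NumPy sketch of how digitized time-domain samples can be turned into a frequency-domain spectrum, which is the core of any SA function built on an oscilloscope front end. The sample rate, tone frequency, and Hann window below are illustrative assumptions, not taken from any particular instrument:

```python
import numpy as np

def fft_spectrum(samples, fs, window=np.hanning):
    """Estimate a relative power spectrum (dB) from digitized samples."""
    n = len(samples)
    w = window(n)
    # Window the record to reduce spectral leakage, then take the
    # one-sided FFT of the real-valued samples.
    spectrum = np.fft.rfft(samples * w)
    # Normalize for the window's coherent gain so a full-scale tone
    # lands near 0 dB, then convert magnitude to dB.
    mag = np.abs(spectrum) / (np.sum(w) / 2.0)
    power_db = 20 * np.log10(np.maximum(mag, 1e-12))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, power_db

# Illustrative example: a 10-MHz tone sampled at 100 MS/s (assumed values).
fs = 100e6
t = np.arange(4096) / fs
tone = np.cos(2 * np.pi * 10e6 * t)
freqs, power_db = fft_spectrum(tone, fs)
peak = freqs[np.argmax(power_db)]
```

A real scope-based SA adds amplitude calibration, resolution-bandwidth shaping, and averaging on top of this, but the windowed-FFT step above is where the time-domain record becomes frequency-domain data.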
PC-driven oscilloscopes, network analyzers, and spectrum analyzers have existed for several years. Generally, these instruments were relegated to the low-performance range and were helpful only for a select few applications. With the arrival of a high-speed serial bus—USB 3.0 at 5 Gbps—it became viable to perform the initial digitization and processing of an RF/microwave signal in external hardware and transport the several gigabits per second of data to a powerful PC over USB. Additionally, commercial PC hardware has stepped up with affordable Intel i7 processors that can crunch a considerable amount of streaming RF data.
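A quick back-of-envelope check shows why the 5-Gbps link was the tipping point for external digitization. The digitizer figures below are assumptions chosen for illustration, not a specific product; the 8b/10b encoding overhead is part of the USB 3.0 SuperSpeed signaling scheme:

```python
# Back-of-envelope: can a digitizer's raw sample stream fit through USB 3.0?
# Digitizer figures are illustrative assumptions, not a specific product.
sample_rate = 200e6   # 200 MS/s A/D converter (assumed)
packed_bits = 16      # 14-bit samples typically packed into 16-bit words
raw_gbps = sample_rate * packed_bits / 1e9

# USB 3.0 signals at 5 Gbps, but 8b/10b line coding leaves roughly
# 4 Gbps of usable payload before protocol overhead.
usb3_payload_gbps = 5.0 * 8 / 10

fits = raw_gbps <= usb3_payload_gbps
print(f"raw stream: {raw_gbps:.1f} Gbps, "
      f"USB 3.0 payload: {usb3_payload_gbps:.1f} Gbps, fits: {fits}")
```

The same arithmetic shows why earlier 480-Mbps USB 2.0 links confined PC-driven instruments to the low-performance range: even a modest multi-MS/s stream overwhelms them.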
I am inclined to believe that with the arrival of USB Type-C—the new reversible connector that pairs a 10-Gbps USB 3.1 link with power delivery over the same cable—a new regime of PC-driven instruments will become possible. Yes, the USB Power Delivery specification supports up to 100 W over that same 10-Gbps interface, through a fully reversible connector even an undergrad couldn’t mess up. I am unsure how well USB-C could be used to sync several modular instruments, but I suspect that it may be up to the task.
I immediately thought of combining USB-C’s high-speed data transfer with an RF/microwave-calibrated digitizer, throwing in a GPU-based computation engine for good measure. The result would be an incredibly powerful PC-driven instrument that could function as both a high-speed mixed-signal oscilloscope and an advanced SA. Add a calibrated and synced generator—which could even be a separate component—and you would have a nearly complete RF/microwave test bench for a fraction of the cost of a box instrument.