Signal identification is critical to many local and federal government groups for security, law enforcement, and intelligence applications. With the growing congestion of complex digitally modulated signals used in wireless systems, especially in urban environments, signal identification and analysis becomes even more challenging in the field, requiring the right mix of measurement equipment and expertise to properly separate and identify the many different signals within a given band.
Working in the field to identify signals does not afford the same convenience of a large, fixed installation as in the laboratory. Of course, some approaches to performing digital signal analysis in the field involve simply using the measurement equipment that was designed for the laboratory. But this method is limited in mobility and effectiveness, favoring the use of equipment and techniques that were designed for portability.
Identification and analysis of unknown wireless signals requires a methodical procedure to minimize errors when tracking and isolating signals. The first step in finding a signal is to search for RF energy at some designated physical location. Measurements with a signal or spectrum analyzer over a bandwidth of concern that show signals at significant power levels are candidates for further analysis. That power level is set at some threshold point that is deemed to represent potential interference to known RF/microwave and other systems (such as computers) in the area.
Signals that are constant (always present) are the easiest to detect and log using a spectrum analyzer. Signals that are intermittent are more difficult to detect, but can be found by using a spectrum analyzer in "max hold" mode or "spectrogram" mode for a duration long enough to allow capture of the intermittent signals. When looking for new signals, the trace mathematics built into many modern spectrum analyzers can help. This trace math uses a reference waveform recorded earlier as a baseline for comparison: the reference waveform is subtracted from any newer waveform in question to produce a difference waveform. Dips in this difference waveform indicate signals that are no longer present, while humps or rises in the waveform show new signals.
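The trace math described above can be sketched in a few lines. This is a minimal illustration, not any instrument's actual implementation; the function name, threshold, and trace values are hypothetical, and traces are assumed to be power-in-dBm arrays on a common frequency grid.

```python
# Hypothetical sketch of "finding new signals" trace math.
def trace_difference(new_trace, reference_trace, threshold_db=6.0):
    """Subtract a stored reference trace from a newer trace.

    Rises above the threshold mark new signals; dips below the
    negative threshold mark signals that have disappeared.
    """
    diff = [n - r for n, r in zip(new_trace, reference_trace)]
    new_bins = [i for i, d in enumerate(diff) if d >= threshold_db]
    gone_bins = [i for i, d in enumerate(diff) if d <= -threshold_db]
    return diff, new_bins, gone_bins

reference = [-90.0, -90.0, -40.0, -90.0, -90.0]   # signal in bin 2
current   = [-90.0, -35.0, -90.0, -90.0, -90.0]   # signal now in bin 1
diff, new_bins, gone_bins = trace_difference(current, reference)
# new_bins flags the new signal; gone_bins flags the vanished one
```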
Mask testing is a useful way to spot new signals (Fig. 1). Any signals recorded at a specific location in the past can be used to create a mask for comparison to newer signals. If there is no alarm in the comparison, the newer recorded signals do not contain any new waveforms of interest. If a mask violation occurs, the trace is logged for further investigation. Instruments with automatic mask creation can greatly speed and simplify this signal hunting procedure.
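The mask test above reduces to a simple per-bin comparison. The sketch below assumes a basic upper-limit mask over the same frequency grid as the trace; the names and levels are hypothetical, chosen only to illustrate the alarm logic.

```python
# Hypothetical sketch of limit-mask testing with an upper-limit mask.
def mask_violations(trace_dbm, mask_dbm):
    """Return the frequency-bin indices where the trace exceeds the mask."""
    return [i for i, (p, m) in enumerate(zip(trace_dbm, mask_dbm)) if p > m]

mask  = [-50.0, -50.0, -30.0, -50.0]   # mask allows a known carrier in bin 2
trace = [-80.0, -45.0, -35.0, -80.0]   # unexpected energy in bin 1
violations = mask_violations(trace, mask)
if violations:
    # in a real instrument, the trace would be logged for investigation
    pass
```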
Another way to log data is to use a spectrogram, which is a three-dimensional display of frequency, power, and time. One implementation of a spectrogram shows signal power levels in different colors, with time on the vertical axis. When the display is paused, a user can scroll through the spectrogram's time axis and view the corresponding frequency-versus-power trace for any point in time within the same display screen (Fig. 2).
The spectrogram is an effective way to spot weak signals that otherwise would be buried in the noise and also a way to easily see intermittent signal behavior that comes and goes on a spectrum trace. The intermittent signals in the example spectrogram of Fig. 2 are easy to spot, since they appear as vertical columns that look like dashes. On a conventional frequency-versus-power display, they would come and go quickly, making them difficult to see. With a spectrogram, operators can scroll through the captured data, analyzing it at leisure.
When using a spectrogram, it is particularly important that the host spectrum analyzer employ Fast Fourier Transform (FFT) signal analysis capability. An FFT-based analyzer captures a wide portion of the frequency range at once, allowing users to see the true shape of the signal even when it is a burst-type or frequency-hopped signal. A conventional swept-frequency spectrum analyzer in spectrogram mode will instead show instrument artifacts in place of the bursted signal.
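The reason an FFT-based analyzer preserves burst shapes is that each spectrogram row comes from one FFT over a contiguous block of captured samples, rather than from a slow frequency sweep. A minimal sketch of that row computation, assuming complex baseband samples, a Hann window, and 50% overlap (all parameter choices are illustrative):

```python
import numpy as np

# Sketch of FFT-based spectrogram rows: each time slice is one FFT of a
# captured block, so even a short burst keeps its true spectral shape.
def spectrogram(iq_samples, fft_len=256, hop=128):
    """Return a (time, frequency) array of power in dB."""
    window = np.hanning(fft_len)
    rows = []
    for start in range(0, len(iq_samples) - fft_len + 1, hop):
        block = iq_samples[start:start + fft_len] * window
        spectrum = np.fft.fftshift(np.fft.fft(block))   # center DC
        rows.append(20 * np.log10(np.abs(spectrum) + 1e-12))
    return np.array(rows)

# Example: a tone 32 bins above center peaks in the same FFT bin
# on every row, as it would appear as a straight spectrogram line.
n = np.arange(1024)
tone = np.exp(2j * np.pi * 32 * n / 256)
rows = spectrogram(tone)
```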
It is possible to use a spectrogram as a time index, to simplify a review of logged signal data. One useful technique for long-term spectrum monitoring is to set the spectrogram trace function to max-hold (to catch bursted signals) and slow the trace update rate. Since the max-hold is reset every time the spectrogram is updated, it is still possible to see signals change, and the spectrogram's time stamp makes this technique even more useful.
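The max-hold behavior between slow spectrogram updates can be sketched as a simple element-wise maximum over the fast sweeps accumulated during one update interval (function and values are hypothetical):

```python
# Sketch of max-hold accumulation: combine the fast sweeps captured
# between two spectrogram updates into one held trace, so short bursts
# survive into the displayed (and time-stamped) spectrogram line.
def max_hold(traces):
    held = list(traces[0])
    for trace in traces[1:]:
        held = [max(h, p) for h, p in zip(held, trace)]
    return held

# A burst appears in bin 0 of one sweep only, yet is kept in the hold.
held = max_hold([[-90.0, -40.0], [-30.0, -90.0]])
```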
While signal logging, spectrograms, and spectrogram logging are useful in finding unknown signals, it is also helpful to have an organized way to keep track of all the signals that are being monitored and recorded. When searching through a large number of signals, it is important to simply be able to identify a signal as one that has come before, or one that is not yet identified. This is where a signal survey has benefits.
On a spectrum analyzer's signal survey screen, the short bars near the top of the screen, as well as the blue box, denote spectral regions of interest (Fig. 3). The bars are the collapsed form and the box is the expanded form. These regions of interest can be created quickly to facilitate locating and identifying signals of interest.
Once RF energy is found and indexed, signals can be screened. The first level of screening will identify RF characteristics such as frequency, channel, bandwidth, and shape. Many signals can be quickly dismissed by a skilled operator, one familiar with the RF signals to be found at that location. For analog signals, there are a number of established techniques to better understand whether they are legitimate.
One technique that combines all four of these first-level measurements is a signal profile test driven off a database stored in the internal memory of a spectrum analyzer (Fig. 4). The signal profile, shown above in green, incorporates the different RF screening elements discussed above. The profiles available for selection are based on what "should" be present at that location and frequency. Using these profiles, the signal can be evaluated against a series of clues to determine legitimacy.
The shape of the profile is the first clue. If the signal matches a profile, this is evidence that the signal is what it appears to be and is legitimate. The second clue is that the profile will be centered on a legal channel for the signal chosen, making any frequency error or channelization errors apparent. The signal profile test combines the elements of frequency, channels, and shape, as well as any available information collected from other investigations. When using a signal profiling tool, it will be apparent if a signal of interest or a similar signal has been scanned before. The signal profiling function is an effective first-pass screening tool for field use.
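A first-pass profile screen like the one described above can be reduced to two checks: how far the measured shape deviates from the stored profile, and how far the measured center frequency sits from the legal channel center. The sketch below is purely illustrative; the function name, thresholds, and frequencies are hypothetical, not those of any particular instrument.

```python
# Hypothetical first-pass profile screen combining shape and channelization.
def profile_screen(trace_dbm, profile_dbm, measured_center_hz,
                   channel_center_hz, max_shape_dev_db=3.0,
                   max_freq_err_hz=5e3):
    """Return True when the trace matches the stored profile and is
    centered on the legal channel within tolerance."""
    shape_dev = max(abs(t - p) for t, p in zip(trace_dbm, profile_dbm))
    freq_err = abs(measured_center_hz - channel_center_hz)
    return shape_dev <= max_shape_dev_db and freq_err <= max_freq_err_hz

trace   = [-60.0, -30.0, -60.0]
profile = [-61.0, -31.0, -59.0]
legit = profile_screen(trace, profile, 850.002e6, 850.0e6)   # small errors
suspect = profile_screen(trace, profile, 850.02e6, 850.0e6)  # off-channel
```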
If a signal has failed screening, further analysis will allow a better understanding of that signal. For example, digitally modulated signals contain periodic information such as time slot duration, symbol rates, chip rates, and hopping frequencies. Using a spectral correlation density (SCD) algorithm, the periodic characteristics unique to a particular signal can be extracted, measured, and compared to a reference in order to classify a particular signal of interest. Depending upon the depth of the analysis, a measurement system employing an SCD algorithm can further distinguish between different signal types within a signal family. The end result is a digital fingerprint of the signal.
An SCD algorithm measures "power" at various frequency offsets within a signal. The cyclic frequency is the distance of a signal component from the center frequency of the signal (Fig. 5); signal components are also called tones or sine waves. It is possible to statistically measure how well the cyclic components of a digitally modulated signal match a particular wireless communications standard. The difference between the expected frequency offset and the measured frequency offset, expressed in percent, is called the alpha error. Low alpha error values indicate a good match between an unknown signal and the standard's representation of that signal.
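The alpha-error idea can be illustrated with a short calculation. The formula below is a straightforward percent-deviation sketch, not a vendor's exact definition; the function name is hypothetical. GSM's nominal symbol rate of roughly 270,833 Hz is used as the expected cyclic frequency.

```python
# Illustrative alpha-error calculation: percent deviation of a measured
# cyclic frequency from the value the standard predicts.
def alpha_error_percent(measured_offset_hz, expected_offset_hz):
    return 100.0 * abs(measured_offset_hz - expected_offset_hz) / expected_offset_hz

# A measured cyclic component at 270,900 Hz vs. GSM's nominal
# 270,833-Hz symbol rate yields a very small alpha error: a good match.
err = alpha_error_percent(270900.0, 270833.0)
```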
The amplitudes of the tones, or cyclic components, can be measured statistically against expected values with a coherence match measurement. A coherence match is the ratio of differences between expected and actual SCD amplitude values at given alpha offsets, i.e., a measure of power at a given frequency within the signal. A coherence match function makes it possible to measure how well the internal rates of specific digital signals conform to expected values.
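One simple way to express such an amplitude match is a worst-case ratio between measured and expected SCD amplitudes at the known alpha offsets. This formula is an assumption for illustration only, not the instrument's actual coherence metric:

```python
# Illustrative coherence-style amplitude match (formula assumed):
# compare measured SCD amplitudes at known alpha offsets against a
# stored reference and report the worst-case ratio.
def coherence_match(measured_amps, expected_amps):
    ratios = [min(m, e) / max(m, e)
              for m, e in zip(measured_amps, expected_amps)]
    return min(ratios)   # 1.0 = perfect match, near 0.0 = poor match
```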
Once a signal has been identified as a threat or as questionable, often the next task is to locate the signal emitter. For this task, a broadband spectrum analyzer designed for signal hunting is very useful. This spectrum analyzer should include various signal hunting tools, such as a signal strength meter, a spectrogram, and most important, the ability to map signals onboard to facilitate analysis of the more difficult signal hunting tasks.
A sensitive spectrum analyzer with a signal strength meter is one of the first tools required for signal hunting. The meter produces a tone, available by speaker or headphones, that varies in pitch with the strength of the signal. Using such an instrument is a field-proven method that lets operators search for a signal while watching their surroundings rather than a spectrum-analyzer display screen. Such a tool allows operators to correlate where their antenna is pointing with the tones they hear: they can simply drive or walk around while listening for a higher pitch. Unfortunately, some signals are more difficult to find. Conditions of low signal strength, multipath effects, or other signal distortion can require more advanced techniques. In such cases, mapping the signals can be very useful and will help untangle complicated situations.
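The power-to-pitch mapping behind such a meter can be sketched as a simple linear scaling. The dBm and frequency ranges below are assumed values chosen for illustration, not any instrument's calibration:

```python
# Sketch of a signal-strength-to-tone mapping: pitch rises with
# received power so an operator can hunt by ear. Range values assumed.
def strength_to_pitch_hz(power_dbm, min_dbm=-100.0, max_dbm=-20.0,
                         min_hz=200.0, max_hz=2000.0):
    clamped = max(min_dbm, min(max_dbm, power_dbm))
    frac = (clamped - min_dbm) / (max_dbm - min_dbm)
    return min_hz + frac * (max_hz - min_hz)
```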
The first type of mapping involves traces or measurements recorded on the map at the operator's current Global Positioning System (GPS)-derived location. At the same time the signal is recorded, an operator can draw a vector, shown in red, to document the bearing of the signal (Fig. 6). This is a quick way to resolve complexities caused by reflections or obstructions. Once data collection is complete, the traces can be reviewed and the data analyzed without any need to return to a vehicle, office, or laboratory. An alternative form of mapping uses a GPS-driven spectrum analyzer to automatically map and record signals as the operator moves around. In this case, the color of the icons indicates when a signal approaches an operator-defined limit. This is a quick way to find signals when the RF issues are complex.
A third form of signal mapping is useful inside buildings, where GPS will not work. In this mode, signals are recorded on a building floor plan automatically as the operator walks around the facility. This method borrows a technique from drive testers, asking for the start and end points of the walk and automatically placing measurements in between. Any of these mapping techniques provides tools to collect, analyze, and document data in the field.
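The drive-test technique above amounts to evenly interpolating measurement positions between the declared start and end points of a straight walk. A minimal sketch (coordinates in meters; names hypothetical):

```python
# Sketch of the indoor walk-test placement: given the start and end
# points of a straight walk, place n measurements evenly in between.
def interpolate_positions(start_xy, end_xy, n_measurements):
    x0, y0 = start_xy
    x1, y1 = end_xy
    return [(x0 + (x1 - x0) * i / (n_measurements - 1),
             y0 + (y1 - y0) * i / (n_measurements - 1))
            for i in range(n_measurements)]

# Five measurements along a 10-m corridor
points = interpolate_positions((0.0, 0.0), (10.0, 0.0), 5)
```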
Of course, not every signal can be identified in the field. New signals, modified signals, signals from damaged equipment, and sometimes unknown or unusual signals exist in every part of the world. Since modified-from-standard signals are likely suspects in the signal hunting game, it may be worthwhile to capture a signal sample for further analysis in a laboratory. Given sufficient acquisition memory, a spectrum analyzer can capture a frame or more of most modern digital signals. If captured in the right format, the data can be analyzed offline by software tools such as the personal computer (PC) version of a real-time spectrum analyzer (Fig. 7) or by analytical tools such as MATLAB software from The MathWorks.
As RF communications systems evolve to the use of newer and more efficient techniques, many of which involve advanced digital modulation formats, sorting out suspect signals from legitimate communications signals becomes more and more difficult. Separating illegitimate RF signals from legitimate signals is the first step. And since signals vary with time, signal logging is often necessary for this task. Logging can be triggered by a time interval or by signal mask violations. Logging in a spectrum analyzer's spectrogram mode allows rapid indexing of traces and provides a quick way to spot intermittent or weak signals.
Once the signal has been spotted, it can be judged against a set of parameters. Power levels, frequency, channelization, shape, location, and timing are all significant. Dynamic behavior, such as bursting or hopping, is also significant. It is also possible to look inside a signal to better understand its internal rate structure using an SCD measurement. This measurement checks the signal's internal data rates, which are difficult to disguise and likely to be affected if the signal is not what it seems.
If a suspicious signal has failed all these tests, it is likely time to locate the signal. A broadband spectrum analyzer with signal hunting tools, as well as integrated mapping, can help in this task, whether the signal hunt is taking place inside or out-of-doors. Finally, if a signal really is new, it may be wise to capture a sample of the signal for later analysis. Saving the unknown signal for categorization can help save time during future signal hunting episodes. Off-the-shelf software, as well as a number of development packages, is available to help with this task.