Carrier-To-Noise Generators Enhance BER Testing

These communications test instruments offer increased accuracy through the careful control of potential causes of measurement uncertainty.

Carrier-to-noise (C/N) generators continue to be essential tools for evaluating the performance of receivers employed in CATV, cellular, satellite-communications, and WLAN systems. These instruments combine a carrier supplied by the user with an internally generated additive-white-Gaussian-noise (AWGN) source to produce an output signal that contains a precise ratio of carrier and noise.

They are particularly well suited for BER, SINAD, and channel-impairment testing, in which they can help determine a receiver's ability to perform under a wide range of signal conditions. C/N generator architecture has changed little over the years, while the expected performance of communications systems has dramatically increased. To address the need for greater instrument capability, accuracy, and reliability, dBm Corp. (Wayne, NJ) has created the CNG Series C/N generators using an architecture that delivers significant improvements in performance, ease of use, and long-term reliability.

The CNG Series (see figure) is a family of instruments with models targeted at intermediate-frequency (IF) applications (typically 70 and 140 MHz) and RF applications from 800 MHz to 6.0 GHz (with specialized models covering frequencies to 30 GHz). The frequency bands of the standard models correspond to the interference conditions required by applications such as satellite or cellular/PCS communications, WLAN, and CATV, and the instruments are compatible with all analog and digital modulation schemes. Fully automated operating modes include C/N, C/No, EbNo, and C/I modes. The instruments feature noise bandwidths from 50 to 180 MHz.
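The relationships among the C/N, C/No, Eb/No, and C/I modes listed above are standard decibel bookkeeping, and a short sketch may make them concrete. The function names below are illustrative, not the instrument's API; the quantities follow the usual definitions (C/No in dB-Hz, noise bandwidth and bit rate in Hz).

```python
import math

def cn_from_cno(cno_dbhz: float, noise_bw_hz: float) -> float:
    """C/N (dB) from C/No (dB-Hz) and the noise bandwidth."""
    return cno_dbhz - 10 * math.log10(noise_bw_hz)

def ebno_from_cno(cno_dbhz: float, bit_rate_hz: float) -> float:
    """Eb/No (dB) from C/No (dB-Hz) and the bit rate."""
    return cno_dbhz - 10 * math.log10(bit_rate_hz)

def ci_ratio(carrier_dbm: float, interferer_dbm: float) -> float:
    """C/I (dB) as the difference of carrier and interferer powers."""
    return carrier_dbm - interferer_dbm

# Example: C/No of 80 dB-Hz in a 1-MHz noise bandwidth at a 2-Mb/s bit rate.
cn = cn_from_cno(80.0, 1e6)      # 80 - 60 = 20 dB
ebno = ebno_from_cno(80.0, 2e6)  # 80 - 63.01 ~ 16.99 dB
```

An instrument offering all four modes is, in effect, solving these same conversions internally from the user's bit-rate and bandwidth entries.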

The instruments can be used in benchtop environments or as part of an automatic-test-equipment (ATE) system for production testing. In the latter application, the use of solid-state attenuators extends the service life and reliability of the instrument and increases execution speed, key considerations in high-volume production testing. The front-panel controls include separate keys to invoke each main function with LED indicators and a bright LCD display to provide a clear indication of all settings.

The CNG Series instruments are available with one or two independent RF channels for simplex, diversity, or duplex testing. As many as 10 test scenarios can be stored in memory, and the instrument automatically measures the carrier signal and precisely adjusts the power of the internal noise source to maintain the desired ratio. External interference signals can also be injected in place of the internal sources, allowing measurement of carrier, noise, and interference power and setting of the C/N and C/I ratios.

The CNG Series instruments employ a thermal termination as the noise source, rather than the traditional noise diode, eliminating diode-based amplitude distribution errors and providing a high crest factor with no asymmetry in noise-voltage distribution. Compensation is automatically provided for bit rate, signal bandwidth, duty cycle, and power-level settings, and input-signal variations can be automatically tracked and negated to ensure the desired ratio is maintained. The instruments downconvert carrier signals to 140 MHz, measure the power at that frequency, and then add a precise level of noise. The signal with noise is then upconverted to its original frequency. By performing the power measurement and noise addition at IF, very high accuracy can be achieved.
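The measure-then-set loop described above reduces to simple decibel arithmetic once the carrier power is known at the IF. The following is a minimal sketch of one tracking iteration, with hypothetical callables standing in for the instrument's internal measurement and noise-setting hardware.

```python
def required_noise_power_dbm(carrier_dbm: float, target_cn_db: float) -> float:
    """Noise power that yields the target C/N in the measurement bandwidth."""
    return carrier_dbm - target_cn_db

def track_ratio(measure_carrier, set_noise_power, target_cn_db: float) -> float:
    """One tracking iteration: re-measure the carrier (at the 140-MHz IF,
    per the text) and re-set the noise so the ratio holds as the input
    signal drifts. Returns the measured carrier power."""
    c_dbm = measure_carrier()
    set_noise_power(required_noise_power_dbm(c_dbm, target_cn_db))
    return c_dbm
```

Repeating this step continuously is what lets the instrument negate input-signal variations automatically.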

Ratio accuracy is the most important specification for a C/N generator, since it has a direct and dramatic effect on the measured performance of the unit under test (UUT). For example, if a 64QAM system is being tested for BER and the C/N ratio uncertainty is a seemingly insignificant ±0.5 dB, the measured BER could nonetheless be inaccurate by a factor of more than 10,000. Consequently, it is imperative that ratio accuracy be as precise as possible, and verifiable. The CNG Series achieves verifiable ratio accuracy of less than 0.2 dB (and typically 0.1 dB) by compensating for every significant source of error.
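The extreme sensitivity of BER to ratio error can be illustrated with the textbook approximation for Gray-coded square M-QAM in AWGN (this is a generic formula, not the vendor's analysis; the exact magnification factor depends on the operating point and waterfall steepness of the system under test).

```python
import math

def qfunc(x: float) -> float:
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def ber_mqam(ebno_db: float, m: int = 64) -> float:
    """Approximate BER for Gray-coded square M-QAM in AWGN."""
    k = math.log2(m)
    ebno = 10 ** (ebno_db / 10)
    return (4 / k) * (1 - 1 / math.sqrt(m)) * qfunc(math.sqrt(3 * k / (m - 1) * ebno))

# A 0.5-dB shortfall in the delivered ratio inflates the measured BER
# by a large factor at a typical 64QAM operating point:
nominal = ber_mqam(20.0)
low = ber_mqam(19.5)
factor = low / nominal
```

On the steep part of the BER waterfall the magnification grows rapidly, which is why even fractions of a decibel of ratio uncertainty matter.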

Errors can arise from variations in noise-density and signal-path flatness, step-attenuator resolution, impedance mismatch, and the method and test equipment used for calibration. To minimize all sources of potential error and ensure ratio accuracy, dBm uses various compensation techniques, and overall ratio accuracy is measured in every instrument over the operating frequency and input power range.

In order to maintain a precise level of Eb/No over frequency, the instrument must account for signal-path gain variations and noise-density variations over the entire operating frequency of the instrument rather than at a single discrete point. The CNG Series accomplishes this by compensating for noise density and signal gain at all frequencies in 1-MHz increments. For signals that occupy a large bandwidth, there is also a possibility of additional C/N uncertainty caused by variations in the signal-path gain and the noise spectral distribution flatness. The instrument also accounts for these effects, utilizing its internal calibration factors and the bit-rate parameter value.
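Conceptually, the per-MHz compensation described above amounts to a table lookup of stored correction factors at the operating frequency. The tables and function below are entirely hypothetical placeholders for the instrument's internal calibration data, included only to make the mechanism concrete.

```python
# Hypothetical per-MHz correction tables (dB), indexed by frequency in MHz.
gain_cal = {140: 0.04, 141: 0.03, 142: -0.02}
noise_density_cal = {140: -0.05, 141: -0.04, 142: -0.01}

def corrected_ebno(raw_ebno_db: float, freq_mhz: float) -> float:
    """Apply the stored gain and noise-density corrections for the
    nearest 1-MHz calibration point to the raw Eb/No setting."""
    f = round(freq_mhz)
    return raw_ebno_db + gain_cal[f] - noise_density_cal[f]
```

For wideband signals, the instrument would additionally integrate such corrections across the occupied bandwidth rather than sampling a single point, consistent with the bit-rate-aware compensation the text describes.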

The CNG Series employs the substitution method for establishing ratio accuracy, which relies on the accuracy and resolution of the noise attenuator. The substitution method eliminates errors caused by power-measurement nonlinearity. However, since accuracy is directly determined by the attenuator (perhaps the most critical component in the instrument), it is essential that it have the finest possible resolution and the least error throughout its attenuation range. To achieve this, dBm created its own attenuator design, which offers 0.016-dB resolution and is compensated in 1-MHz steps at every attenuation setting by a dedicated microprocessor. All attenuators are calibrated and verified over their range of operating frequencies and attenuation settings. This technique reduces resolution uncertainty to less than 0.008 dB and typical error to less than 0.02 dB.
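The quoted resolution uncertainty follows directly from the step size: snapping any requested attenuation to the nearest available step leaves a worst-case quantization error of half a step, i.e. 0.008 dB for a 0.016-dB attenuator. A quick sketch (illustrative only) confirms the bound.

```python
STEP_DB = 0.016  # attenuator resolution quoted in the text

def quantize_attenuation(target_db: float) -> float:
    """Snap a requested attenuation to the nearest available step."""
    return round(target_db / STEP_DB) * STEP_DB

# Sweep 0 to 60 dB in 0.001-dB increments; the residual never
# exceeds half a step (0.008 dB).
err = max(abs(quantize_attenuation(t / 1000) - t / 1000)
          for t in range(0, 60000))
```

This is the resolution floor only; the remaining (typically <0.02 dB) error budget comes from the attenuator's calibrated step accuracy.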

Similar attention was paid to the internal power meter, which must compensate for hardware inaccuracy over the full input power range to reduce error. The CNG Series instruments employ a dedicated microprocessor to provide this compensation, and the user can specify the sampling rate and number of samples. Signal-path gain is also well controlled: input-to-output gain is typically better than 0 dB ±0.1 dB, and calibration is performed in 1-MHz steps over the instrument's frequency range.
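One reason a user-selectable sample count matters: averaging N independent detector samples shrinks random measurement noise roughly as 1/sqrt(N). The simulated detector below is a stand-in, not the instrument's actual measurement path.

```python
import random
import statistics

def average_power_reading(read_sample, n_samples: int) -> float:
    """Average n raw detector samples; random noise on the reading
    shrinks roughly as 1/sqrt(n)."""
    return statistics.fmean(read_sample() for _ in range(n_samples))

# Simulated noisy detector around a true -30.0-dBm level
# (0.05-dB standard deviation per sample, fixed seed for repeatability):
rng = random.Random(0)
reading = average_power_reading(lambda: -30.0 + rng.gauss(0, 0.05), 1000)
```

With 1000 samples the residual random error falls to a few thousandths of a decibel, leaving only the systematic terms that the calibration factors remove.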

One of the largest contributors to error is mismatch uncertainty. While a return loss of 20 dB is more than adequate for most applications, it results in a mismatch uncertainty of nearly ±0.1 dB. Good matching and careful component selection in the CNG Series help achieve return losses in excess of 26 dB, reducing mismatch uncertainty to less than 0.02 dB.
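The figures quoted above follow from the standard worst-case mismatch formula: each return loss maps to a reflection coefficient, and the uncertainty bound is 20·log10(1 + Γ1·Γ2). A short check (generic formula, not vendor data):

```python
import math

def mismatch_uncertainty_db(rl1_db: float, rl2_db: float) -> float:
    """Worst-case mismatch uncertainty (dB) between two ports with the
    given return losses, via 20*log10(1 + gamma1*gamma2)."""
    g1 = 10 ** (-rl1_db / 20)
    g2 = 10 ** (-rl2_db / 20)
    return 20 * math.log10(1 + g1 * g2)

# Two 20-dB return losses give ~0.086 dB ("nearly +/-0.1 dB");
# two 26-dB return losses give ~0.022 dB, falling below 0.02 dB
# as the return loss improves further.
```

This is why a few decibels of additional return loss buys a disproportionate improvement in ratio accuracy.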

Calibration of each instrument is fully automated and executed by a single program, with the software-correction factors resident in the instrument. The instrument is calibrated in 1-MHz increments using a precision reference tunable frequency converter, which ensures that all power measurements are made at 50 MHz, the frequency at which power sensors are typically calibrated, eliminating power-meter uncertainty versus frequency. There are 65 calibration factors collected and stored for each 1 MHz of operating bandwidth, which translates into more than 8500 calibration points for a typical instrument.

dBm Corp., 6 Highpoint Dr., Wayne, NJ 07470; (973) 709-0020, FAX: (973) 709-1346, e-mail: [email protected], Internet:
