Calibration Can Make All the Difference

June 30, 2016
Calibrating RF/microwave test instruments may seem unnecessary, but it can make a huge difference in the quality of a company’s products as well as the company’s reputation.

During lunch recently with Ulrich Rohde of Synergy Microwave Corp., the topic of testing different components came up—in particular low-noise oscillators. Customers and visitors to Synergy know that the company is as well equipped with test gear as any small company in the RF/microwave industry, even to the extent of installing a Faraday cage for measurements isolated from the plethora of electromagnetic interference (EMI) and radio-frequency interference (RFI) around us. But Ulrich pointed out that having all the test equipment was no advantage unless the equipment was properly calibrated.

Instruments that are out of calibration will provide inaccurate test results. In production testing, this can result in poorly performing components seemingly meeting their performance requirements, then being shipped to a soon-to-be unhappy customer. It can also result in perfectly good products apparently failing to meet their required specifications and being subjected to unnecessary rework.
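
As a rough illustration of both failure modes, the short simulation below applies a fixed measurement offset to an invented population of parts and counts how many good units fail and how many bad units pass at a hypothetical specification limit. All of the numbers are assumptions chosen only for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

SPEC_LIMIT_DBM = 10.0                          # hypothetical minimum output-power spec
true_power = rng.normal(10.2, 0.3, 100_000)    # invented distribution of actual part performance

for offset_db in (-0.4, +0.4):                 # assumed uncorrected instrument errors
    measured = true_power + offset_db
    false_reject = np.mean((true_power >= SPEC_LIMIT_DBM) & (measured < SPEC_LIMIT_DBM))
    false_accept = np.mean((true_power < SPEC_LIMIT_DBM) & (measured >= SPEC_LIMIT_DBM))
    print(f"offset {offset_db:+.1f} dB: "
          f"{false_reject:.1%} good parts rejected, "
          f"{false_accept:.1%} bad parts accepted")
```

An error in one direction drives needless rework; an error in the other direction ships bad parts. Either way, the out-of-calibration instrument quietly makes the wrong decision for you.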

Calibration is a means of ensuring that a test instrument is performing within its required limits, typically by comparing it to a reference standard or a similar instrument that is at least 10 times more accurate than the instrument under calibration. Calibrations of certain types of analyzers, like microwave vector network analyzers (VNAs), can also be performed with calibration kits available from a number of suppliers. These kits typically provide the passive components to perform a short-open-load-through (SOLT) calibration of a VNA prior to making measurements in the laboratory or on the production floor, but they do not equate to the instrument calibration performed by qualified test labs.
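
To sketch the arithmetic behind the one-port (short-open-load) portion of such a measurement calibration, the snippet below solves the standard three-term error model (directivity, source match, and reflection tracking) from measurements of three known standards, then applies those terms to correct a raw reflection measurement of a device under test. The standard definitions and measured values are invented for illustration; this is not the procedure of any particular VNA or calibration kit.

```python
import numpy as np

def solve_one_port_error_terms(gamma_std, gamma_meas):
    """Solve the three-term one-port error model from three known standards.

    Model: gamma_m = e00 + e11*gamma_a*gamma_m - delta_e*gamma_a,
    where delta_e = e00*e11 - (e10*e01). Returns (e00, e11, delta_e).
    """
    A = np.array([[1.0, ga * gm, -ga] for ga, gm in zip(gamma_std, gamma_meas)],
                 dtype=complex)
    b = np.array(gamma_meas, dtype=complex)
    e00, e11, delta_e = np.linalg.solve(A, b)
    return e00, e11, delta_e

def correct(gamma_meas, e00, e11, delta_e):
    """Apply the error terms to a raw reflection measurement of the DUT."""
    return (gamma_meas - e00) / (gamma_meas * e11 - delta_e)

# Invented single-frequency example: ideal short, open, and load definitions,
# plus made-up raw measurements of those standards and of a DUT.
standards = [-1.0, 1.0, 0.0]                      # assumed ideal short, open, load
raw_std   = [-0.93 + 0.05j, 0.96 + 0.02j, 0.04 - 0.01j]
e00, e11, delta_e = solve_one_port_error_terms(standards, raw_std)

raw_dut = 0.35 + 0.10j                            # hypothetical raw DUT reading
print("corrected reflection coefficient:", correct(raw_dut, e00, e11, delta_e))
```

A full SOLT calibration extends this idea to both ports plus the through standard, and a real kit supplies characterized (not ideal) standard definitions. The key point stands: this corrects the measurement setup, but it is no substitute for periodic instrument calibration by a qualified lab.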

There is no single acceptable calibration cycle, such as an annual calibration; the right interval depends on the instrument and on the schedule its manufacturer recommends. Newer test instruments include self-calibration functions that provide a certain amount of correction for internal components, such as oscillators, that drift over time. Such functions can extend an instrument's required calibration cycle, within the limits of the self-calibration circuitry's correction range. One excellent resource is "Setting and Adjusting Instrument Calibration Intervals," an application note from Keysight Technologies.
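
That application note covers several interval-analysis methods in detail. As a loose sketch of one common idea, stretching the interval after in-tolerance results and pulling it in after an out-of-tolerance result, consider the rule below; the factors and limits are assumptions for illustration only, not values from the note or from any standard.

```python
def next_interval_months(current_months, found_in_tolerance,
                         extend_factor=1.25, shorten_factor=0.6,
                         min_months=3, max_months=24):
    """Crude interval-adjustment rule: lengthen the calibration interval after an
    in-tolerance result, shorten it after an out-of-tolerance result.
    All factors and limits here are illustrative assumptions.
    """
    factor = extend_factor if found_in_tolerance else shorten_factor
    new_interval = current_months * factor
    return round(min(max(new_interval, min_months), max_months))

# Example: an instrument on a 12-month cycle is found in tolerance twice,
# then out of tolerance once.
interval = 12
for in_tolerance in (True, True, False):
    interval = next_interval_months(interval, in_tolerance)
    print(interval)
```

Real interval analysis weighs the cost of an out-of-tolerance escape against the cost of calibration downtime, which is exactly the tradeoff discussed next.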

Some test-equipment owners may feel that calibration is a "necessary evil" and that the test gear should have been better designed and built in the first place to eliminate the need for calibration cycles. Companies whose instruments are used to qualify products for military customers may be required to calibrate to ANSI/NCSL Z540-1 or MIL-STD-45662A requirements, and may bemoan the time their instruments spend away for calibration. Such calibration procedures are quite detailed and must be traceable to standards set by the National Institute of Standards and Technology (NIST) for a particular type of instrument. For most companies, this work is best left in the hands of firms specializing in such calibration services.

Admittedly, the need for calibration may remove an invaluable instrument like a VNA or digital storage oscilloscope (DSO) from a "well-oiled" production line. But the time spent in calibration is time well spent compared to the time lost to unnecessary rework on perfectly good products, or to the possibility of shipping poorly performing products to a customer, with the resulting loss of company credibility. Consider it time invested in peace of mind.


About the Author

Jack Browne, Technical Contributor, has worked in technical publishing for over 30 years. He managed the content and production of three technical journals while at the American Institute of Physics, including Medical Physics and the Journal of Vacuum Science & Technology. He has been a Publisher and Editor for Penton Media, started the firm’s Wireless Symposium & Exhibition trade show in 1993, and currently serves as Technical Contributor for that company's Microwaves & RF magazine. Browne, who holds a BS in Mathematics from City College of New York and BA degrees in English and Philosophy from Fordham University, is a member of the IEEE.
