Answer: When you don’t read the standard properly!
I was verifying my EN 61000-4-6 conducted RF immunity test setup after building some new test adaptors and acquiring some new equipment when I came across something that left me scratching my head. I figured it out eventually and updated my calibration procedure with a note, but it had me puzzled for an hour!
Like most conducted immunity signal generators, the one I use combines a signal generator and modulator with a power amplifier and some front panel controls/readouts for performing the basic functions. It also has an RF Input for calibrating Coupling/Decoupling Networks (CDNs): it measures the voltage at the 150/50 ohm calibration adaptor and sets the output voltage of the generator to the correct level. My generator has an LED bargraph display showing the level, which provides a reassuring visual confirmation that everything is OK.
Confused by Conducted, Stumped by the Scope
Having calibrated my new CDN at 3V, and since I had a scope within reach, I decided to run the test while monitoring the output of the calibration adaptor with the scope to make sure it was all working OK.
I did not see the expected 3V level; instead the RMS measurement on the scope was 0.5V and the pk-pk was just over 1.5V. I checked my 50 ohm thru termination on the scope input and even swapped it for a different one. My second scope also read the same voltage, so it clearly wasn’t the scope. Puzzling.
I swapped the CDN for one that had been previously calibrated and the lower than expected output voltage persisted. Turning the generator voltage up to 10V, I couldn’t even achieve 3V on the scope. Yet when I swapped the connection from the scope back to the generator’s RF Input, it proclaimed that yes, the output was indeed at the level the generator said it was.
Putting a BNC T-piece in series and monitoring the voltage with the RF Input terminating the signal still gave the same result. Was the generator’s RF Input broken and reading the wrong voltage?
I checked the operating manual of the generator – the cal setup I’ve been using for years is correct. Then I carefully read the standard, focusing on the section that deals with calibration of the test adaptors. All became clear…
Open Circuit Voltage vs Loaded Voltage
EN 61000-4-6 specifies the test levels in terms of Uo, the open circuit voltage. However, the generator level setting part of the calibration is based on a measurement of Umr, the measured output voltage. Below is a slightly simplified version of Figure 9 from the 2014 edition of the standard, showing the impedances of each part of the system.
Tucked away at the bottom of the calibration section is the formula that links the two together.
Umr = Uo / 6
Which yields the following values that the RF Input of the generator, or the scope, should expect to measure:
For the measurement, the impedance of the decoupling network in the CDN is high enough that the termination of the AE port has no significant effect on the measurement; most of the current flows through the EUT port network. You should be able to open or short the 150 ohm AE port termination without seeing the measured output voltage change significantly.
By simplifying the above image and applying a bit of Ohm’s law you can clearly see that Umr is 1/6 of Uo.
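The simplification can be sketched as a plain voltage divider. A minimal sketch, assuming the impedances shown in the standard’s figure: the source presents Uo behind a 150 ohm common-mode impedance, and the 150-to-50 ohm calibration adaptor adds a 100 ohm series resistor ahead of the 50 ohm input of the measuring instrument (the function and variable names here are illustrative, not from the standard):

```python
def measured_voltage(u_o, r_source=150.0, r_adaptor=100.0, r_meas=50.0):
    """Voltage across the 50 ohm measuring input for open-circuit voltage u_o.

    Simple series voltage divider: the 50 ohm measuring input sees its
    share of u_o across the total loop impedance.
    """
    return u_o * r_meas / (r_source + r_adaptor + r_meas)

u_o = 3.0  # V RMS, the calibrated test level
u_mr = measured_voltage(u_o)
print(f"Umr = {u_mr:.3f} V")               # 3 x 50/300 = 0.500 V
print(f"Ratio Uo/Umr = {u_o / u_mr:.1f}")  # 6.0
```

Which matches the 0.5V RMS the scope was reading for a 3V test level.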
Of course these are RMS voltages. If the scope you are measuring with doesn’t have an RMS function then you’ll probably be measuring peak to peak voltages. For a sine wave the conversion factor is:
Vpk-pk = Vrms x 2 x sqrt(2)
Which, when added to the above table, makes life a bit easier.
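The table values can be reproduced with a few lines of Python. A sketch, assuming the common test levels of 1, 3 and 10 V RMS: for each open-circuit level Uo it prints the expected adaptor reading as both RMS and, for scopes without an RMS function, peak to peak:

```python
import math

test_levels = [1.0, 3.0, 10.0]  # Uo in V RMS

print(f"{'Uo (V rms)':>10} {'Umr (V rms)':>12} {'Umr (V pk-pk)':>14}")
for u_o in test_levels:
    u_mr = u_o / 6                   # loaded voltage at the 150/50 adaptor
    u_pp = u_mr * 2 * math.sqrt(2)   # sine-wave RMS to peak-to-peak
    print(f"{u_o:>10.1f} {u_mr:>12.3f} {u_pp:>14.3f}")
```

For the 3V level this gives 0.5V RMS and roughly 1.41V pk-pk, which lines up with the scope readings earlier.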
Armed with this new knowledge I revisited my calibrations and found that yes, everything was measuring correctly. The RF generator, being designed specifically for conducted RF immunity testing, takes care of the divide-by-6 in its calculations.
As an ex-colleague was often heard to remark, “every day is a school day”, and today’s lesson was a good one. I hope this article saves you a bit of head scratching next time you are verifying your conducted RF immunity test setup.