Jim,
I disagree that the ARRL test method is inaccurate. There are standards and
there are standards: "The nice thing about standards is that there are so
many to choose from" (a well-known industry maxim).
Simple test methods that have traceable calibration and proof steps are just
as valid as the $100,000 do-it-all spectrum analyzer, and more so if the
spectrum analyzer is not in traceable calibration to NIST. In fact, we have
in our lab some spectrum analyzers that, despite their $20,000 cost and
being top of the line at the time, never held calibration as well as
separate signal generators and meters. We spent most of our time with them
just making sure they would hold calibration for a day. (Most spectrum
analyzers now have some kind of built-in signal source, but these can be
inferior to external standard signal generators, and of course they do not
replicate off-the-air signals.)
We often use simple traceable methods with audio-detected outputs to
cross-check the internal results of spectrum analyzers, which are themselves
subject to overloading and all sorts of other problems unless used
appropriately. Just because something has a digital readout does not make
it right! Far from being jury-rigged testing, the ARRL Lab follows a
reasonable, demonstrable procedure that is publicly published, so that all
manufacturers have a level playing field and know how to interpret ARRL
measurements.
I would bet that if you checked many of the equipment manufacturers, you
would find the calibration sticker out of date on that $100,000 spectrum
analyzer, if they have one at all, because it costs quite a bit just to
certify that such instruments are not giving you garbage results. Today's
analyzers are far better than the early ones, but they had further to go to
become reproducible-results instruments than individual signal sources,
detectors, and readout devices did. Even the vaunted spectrum analyzers are
calibrated against standard old signal generators, passive but accurate
precision attenuator boxes, and linear dB meters used as readouts to check
the A/D and digital readout sections.
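To illustrate the kind of cross-check I mean (my own sketch, not any lab's
actual procedure), the arithmetic is simple: a traceable source level minus
a precision attenuator setting should match the analyzer's reading within
some tolerance. In Python, with an assumed 0.5 dB tolerance and made-up
levels:

def check_reading(source_dbm, attenuator_db, analyzer_dbm, tol_db=0.5):
    # Expected level at the analyzer input is the source level minus
    # the pad; the 0.5 dB tolerance is an assumed example value.
    expected_dbm = source_dbm - attenuator_db
    error_db = analyzer_dbm - expected_dbm
    return abs(error_db) <= tol_db, error_db

# Example: a -10 dBm source through a 30 dB pad should read -40 dBm.
ok, err = check_reading(-10.0, 30.0, -40.3)
print("within tolerance:", ok, "error: %+.1f dB" % err)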
The challenge, Jim, is not making accurate measurements, but overcoming the
noise floor of the instruments and the setup variations that can mask the
true results: local interference sources, sneak paths, or the need to test
something with probes stuck into places that are normally closed up and
shielded.
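For a sense of scale (my own worked example, not part of the ARRL
procedure): the theoretical noise floor follows from thermal noise at room
temperature, about -174 dBm/Hz, plus the measurement bandwidth and the
instrument's noise figure:

import math

def noise_floor_dbm(bandwidth_hz, noise_figure_db):
    # Thermal noise density at ~290 K is about -174 dBm/Hz; the floor
    # rises with 10*log10(bandwidth) plus the instrument's noise figure.
    return -174.0 + 10.0 * math.log10(bandwidth_hz) + noise_figure_db

# Example: a 500 Hz bandwidth and a 10 dB noise figure put the floor
# near -137 dBm; anything weaker than that is masked.
print("%.0f dBm" % noise_floor_dbm(500, 10))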
Before ARRL introduced a standard and accurate method, manufacturers could
and did cobble together any old test gear and make any claim they wanted
about their specs. There were at least two or three competing ways to
measure receivers. And dare I mention the "accuracy" of rig manufacturers
in regard to S meters? There are about as many variations in how to
determine S9 as there are rig manufacturers. They have gotten better in
recent years, but there is still variance: you hear what sounds like
S9-plus, and the S meter reads only S6. I have seen plenty of commercial
receivers like that, including some considered top of the line now, judging
by their bells and whistles and cost.
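To put numbers on the S9 point (my example, assuming the common IARU
Region 1 HF convention of S9 = 50 microvolts into 50 ohms, about -73 dBm,
with 6 dB per S unit; actual rigs vary, which is exactly the problem):

def s_units(level_dbm, s9_dbm=-73.0, db_per_s=6.0):
    # Convert a signal level to an S reading under an assumed
    # convention: S9 = -73 dBm at HF, 6 dB per S unit.
    return 9.0 + (level_dbm - s9_dbm) / db_per_s

# The same -73 dBm signal reads S9 under the IARU convention, but only
# about S6 on a hypothetical meter that calibrates S9 at -55 dBm.
print("IARU-style meter: S%.0f" % s_units(-73.0))
print("Looser meter:     S%.0f" % s_units(-73.0, s9_dbm=-55.0))

A three-S-unit spread on the same signal is exactly the sort of variance
you see between brands.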
Oftentimes I have seen a review in QST raise a question about variance from
the manufacturer's published spec and then give the maker a chance to check
out the test radio; indeed they find a problem, and the radio retests fine
in the ARRL Lab. ARRL gets its radios for test from open stock, not
special souped-up prototypes; they are pulled off the production line in
random order to represent the average quality of units sold.
I know W1RFI to be very conscientious and concerned that the ARRL Lab
results are true examples, reproducible by similarly equipped labs. And I
know from past reviews and re-reviews of various equipment that a dialogue
for improving the product goes on between ARRL Lab and the submitting
company. Updates of a product often include things ARRL cited in the first
reviews. Just take a look at ARRL Product reviews over the past 10 years
and you will find cases of this.
73, Stuart K5KVH
(responsible for Quality Control and Calibrations in a government-funded
data reduction lab dealing in A/D calibrations, instrument calibrations, and
data accuracy analysis.)