Hi,
I'm trying to understand the differences in counts for the default settings of
the S-meter. I'm rather new to this forum, so perhaps this has been explained
long ago; if so, I would appreciate a reference.
Normally the difference between S-units should be 6 dB, and I'm sure TT has
tried to be precise. The counts are not far off, but the count intervals vary
quite a bit, and there must be a reason for the uneven spacing. An
old-fashioned AGC could easily give a much more linear S-meter reading, so I
wonder whether the unevenness comes from conflicting or anticipated
operational settings involving the DSP action: NR, AGC decay speed, hang time,
RF Gain?
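For comparison, here is a small Python sketch (my own, not anything from the
radio's firmware) of the levels that evenly spaced S-units would correspond
to, assuming the usual IARU Region 1 convention of S9 = 50 uV (-73 dBm into
50 ohms) and exactly 6 dB per S-unit; the default count table could be checked
against something like this:

# Expected S-meter calibration points, assuming S9 = 50 uV (-73 dBm
# into 50 ohms) and exactly 6 dB per S-unit below S9.
S9_UV = 50.0
for s in range(1, 10):
    db_below_s9 = 6 * (9 - s)
    uv = S9_UV / 10 ** (db_below_s9 / 20)
    print(f"S{s}: {-73 - db_below_s9:5d} dBm  {uv:8.3f} uV")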
What prompted this was a 40 m CW signal reading close to S9+60 dB from a
station which, at more than 500 km away, couldn't possibly deliver a 50 mV
signal at the antenna connector.
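To show the arithmetic behind the 50 mV figure (again my own check, assuming
the common S9 = 50 uV reference at the antenna): each 20 dB over S9 multiplies
the voltage by 10, so 60 dB over S9 is a factor of 1000.

# Voltage corresponding to "S9 + N dB", assuming S9 = 50 uV at the antenna.
S9_UV = 50.0
for n_db in (20, 40, 60):
    mv = S9_UV * 10 ** (n_db / 20) / 1000   # convert uV to mV
    print(f"S9+{n_db} dB: {mv:6.1f} mV")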
If I had a calibrated RF generator I would of course calibrate the S-meter
myself, but I don't have one. Has anybody tried to check the default settings?
73, OZ6NF, Gunner