A perfect 16-bit ADC would have a difference of about 98 dB between the average noise level and the level of a full-scale sinusoid. This is a basic fact, and it is NOT violated by ADCs used in SDR receivers.
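For reference, that figure comes from the standard ideal-ADC relation SNR = 6.02·N + 1.76 dB for a full-scale sine; a minimal sketch (the function name is my own):

```python
def ideal_adc_snr_db(bits: int) -> float:
    """Ideal SNR of an N-bit ADC for a full-scale sinusoid,
    counting quantization noise only: 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

print(round(ideal_adc_snr_db(16), 1))  # 98.1 dB for a perfect 16-bit ADC
```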
As long as the quantization noise is not correlated with the signal, the ADC will successfully convert signals much, much smaller than the quantization step. The noise effectively interpolates between the ADC steps.
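That effect is easy to demonstrate numerically. A sketch (NumPy, with arbitrary parameters): a sine of only 0.4 LSB amplitude rounds to all zeros in a bare quantizer, but reappears once dithered conversions are averaged:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024
t = np.arange(n)
ref = np.sin(2 * np.pi * 5 * t / n)
signal = 0.4 * ref                  # amplitude well below one LSB

# Without dither, a rounding quantizer flattens the whole signal to zero.
undithered = np.round(signal)

# With ~1 LSB rms Gaussian dither and averaging over many conversions,
# the sub-LSB sine emerges from the quantized output.
trials = 2000
acc = np.zeros(n)
for _ in range(trials):
    acc += np.round(signal + rng.normal(0.0, 1.0, n))
recovered = acc / trials

# Project the averaged output onto the reference sine: ~0.4 LSB comes back.
amp = np.dot(recovered, ref) / np.dot(ref, ref)
print(round(float(amp), 2))
```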
Yes, it is. Yes, that's the essence of it.

73, Sinisa YT1NT, VE3EA
_______________________________________________
TenTec mailing list
TenTec@contesting.com
http://lists.contesting.com/mailman/listin
That is another matter. Dithering removes the nonlinear effects of the ADC steps, but it does not remove other forms of nonlinearity. Due to ADC nonlinearities other than the quantization steps, some distortion products remain.
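A sketch of why: push a pure sine through a transfer curve with a small cubic error term (a stand-in for ADC integral nonlinearity; the 0.1% coefficient is my arbitrary choice) and a third-harmonic spur appears in the spectrum, dither or no dither:

```python
import numpy as np

n = 4096
k = 129                                   # fundamental on an exact FFT bin
x = np.sin(2 * np.pi * k * np.arange(n) / n)

# Hypothetical INL: a 0.1% cubic term in the converter's transfer curve.
y = x + 1e-3 * x**3

spec = np.abs(np.fft.rfft(y * np.hanning(n)))
hd3_db = 20 * np.log10(spec[3 * k] / spec[k])
print(round(float(hd3_db), 1))            # third harmonic near -72 dBc
```

The sin^3 term expands to (3·sin θ − sin 3θ)/4, so the spur sits at 20·log10(2.5e-4) ≈ −72 dBc; no amount of dither moves it, because it is correlated with the signal.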
Filters like those have trouble achieving 40 dB attenuation in the next/previous ham band. At +/- 750 kHz (or +/- 1.5 MHz) the attenuation is practically non-existent.

73, Sinisa YT1NT, VE3EA
Hi Rick, that's what I had in mind. Taking the 7 MHz filter as an example, there is 40 dB of attenuation at ~3.5 MHz, but at 6 MHz the attenuation is practically nil. Your preselector is the right tool for the job.
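Those numbers are roughly what a low-order octave band filter gives. A sketch, assuming a 3rd-order Butterworth bandpass covering 6-8 MHz (my assumption for illustration, not the rig's actual filter):

```python
import math

def butter_bp_atten_db(f, f1=6.0e6, f2=8.0e6, order=3):
    """Attenuation of an ideal Butterworth bandpass via the standard
    lowpass-to-bandpass mapping: |H|^2 = 1 / (1 + w**(2*order))."""
    f0 = math.sqrt(f1 * f2)                  # geometric center frequency
    w = (f / f0 - f0 / f) * f0 / (f2 - f1)   # normalized offset
    return 10 * math.log10(1 + w ** (2 * order))

print(round(butter_bp_atten_db(3.5e6), 1))   # ~42 dB at 3.5 MHz
print(round(butter_bp_atten_db(6.0e6), 1))   # ~3 dB at 6 MHz: practically nil
```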
Only about 10 dB of attenuation 1 MHz away from the tuned frequency. Not exactly brilliant.

73, Sinisa YT1NT, VE3EA
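That 10 dB figure is about what a single tuned circuit with moderate loaded Q delivers. A sketch, assuming Q ≈ 10 at 7 MHz (both values are my assumptions):

```python
import math

def resonator_atten_db(f, f0=7.0e6, q=10.0):
    """Attenuation of a single parallel-LC resonator with loaded Q."""
    x = q * (f / f0 - f0 / f)
    return 10 * math.log10(1 + x * x)

print(round(resonator_atten_db(8.0e6), 1))   # ~9 dB at +1 MHz
print(round(resonator_atten_db(6.0e6), 1))   # ~10 dB at -1 MHz
```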