Attention, all technically minded amp'ers!
Maybe somebody has a good idea that could contribute to some work I'm doing,
which might end up being useful for the ham community.
First I will explain the situation; then comes the question.
Picture the following amplifier setup: The core is a simple class E amplifier
stage, using one MOSFET, or several in parallel, with a single-band tuned
matching network, operating from a fixed-voltage power supply. This is a "black
box" with three inputs and one output: power supply input, RF drive input, gate
bias input, and RF output. Let's assume this amplifier block to be optimized
for operation at high efficiency (say, 90%) at 1500W output, a condition in
which it's driven into moderate, but not very deep, saturation by a 50-100W
drive signal.
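Just to put numbers on that black box (taking 75W as a middle-of-the-road drive
level; the rest is straight from the figures above), a quick Python scratchpad:

  import math
  P_out = 1500.0           # W, RF output
  eff   = 0.90             # drain efficiency
  P_dc  = P_out / eff      # about 1667 W from the fixed supply
  P_diss = P_dc - P_out    # about 167 W of heat in the MOSFET(s), roughly,
                           # ignoring where the drive power ends up
  gain_db = 10 * math.log10(P_out / 75.0)   # about 13 dB of power gain
  print(P_dc, P_diss, gain_db)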
Such an amplifier will of course be very nonlinear. But it is possible to vary
its gain over a significant range by varying the gate bias. So, we will now
wrap a linearization circuit around this black box: A highly linear envelope
detector at the output, another such envelope detector at the input, and an
integrating comparator that drives the gate bias. Of course there is also a
clamp that limits the highest bias to a safe value. The transceiver's ALC could
be driven from this circuit too, so that the ALC is activated when the bias
control circuit starts running out of headroom.
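For anyone who prefers to see a loop as code, here is a minimal discrete-time
sketch of the idea in Python. The gain-versus-bias curve and all the constants
are invented placeholders; the point is only to show the integrating
comparator, the clamp, and where the ALC line would be asserted:

  import numpy as np

  def env_gain(bias):
      # Invented gain-vs-bias curve: more bias gives more gain,
      # flattening out as the stage heads toward saturation.
      return 20.0 * (1.0 - np.exp(-3.0 * bias))

  fs = 48000.0                                   # envelope sample rate
  t = np.arange(0, 0.02, 1/fs)
  in_env = 0.5 + 0.5*np.sin(2*np.pi*300*t)**2    # stand-in for an SSB envelope

  target_gain = 15.0      # desired overall envelope gain (arbitrary units)
  ki = 200.0              # integrator speed
  bias_max = 1.0          # clamp: highest safe bias
  bias = 0.0
  alc = np.zeros(len(t), dtype=bool)
  out_env = np.zeros(len(t))

  for n, e_in in enumerate(in_env):
      e_out = env_gain(bias) * e_in
      err = target_gain * e_in - e_out    # comparator: scaled input vs. output
      bias += ki * err / fs               # integrator drives the gate bias
      if bias >= bias_max:                # clamp to a safe value...
          bias = bias_max
          alc[n] = True                   # ...and this is where the rig's ALC
                                          # would be told to back off
      bias = max(bias, 0.0)
      out_env[n] = e_out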
When this contraption is driven by an SSB signal, the bias control circuit will
at all times apply a bias voltage such that the output amplitude remains very
closely proportional to the input amplitude. The amplifier will move through
several operating classes: from class A at extremely low signals, through class
AB, into non-saturated class C, where it will stay over most of its dynamic
range, and finally into class E at the highest amplitude levels. The result will
be very good amplitude linearity, along with pretty good average efficiency over
the entire dynamic range. I expect something like 60% average efficiency in SSB,
compared to 20% or so for conventional linear amps.
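That 60% is an estimate, of course. Here is the kind of back-of-the-envelope
check behind it, with two big assumptions that are mine, not measurements:
instantaneous efficiency rising linearly with output amplitude up to 90% at
full swing (as in an ideal class B/C stage on a fixed supply), and a heavily
processed speech envelope modeled as a clipped exponential. Lighter processing
pushes the number down:

  import numpy as np
  rng = np.random.default_rng(0)
  a = np.clip(rng.exponential(0.5, 1_000_000), 0, 1.0)  # envelope, 0..1
  eff = 0.9 * a                   # assumed efficiency vs. amplitude
  p_out = a**2                    # normalized RF output power
  p_dc = np.where(a > 0, p_out / np.maximum(eff, 1e-9), 0.0)
  print("average efficiency:", p_out.sum() / p_dc.sum())  # about 0.6 here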
Now the question: How can I get an undistorted sample of the drive signal for
my input-side envelope detector?
The problem is that the MOSFET amplifier has strong internal feedback from the
output to the input, and as a result the drive signal produced by a transceiver,
with its non-zero output impedance, gets quite distorted. However good my bias
control circuit might be, the output envelope cannot be less distorted than the
amplifier-polluted drive signal is!
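To make that mechanism explicit: the voltage at the amplifier input is the
clean drive plus a fed-back component (through the drain-gate capacitance)
developed across the source impedance, and that component is a nonlinear
function of the output. A toy phasor model in Python, with every number
invented, shows that the envelope error is level-dependent, which is exactly
what the input detector cannot tolerate:

  import numpy as np
  e = np.linspace(0.01, 1, 200)          # clean drive envelope, normalized
  drain = np.tanh(2.5 * e)               # assumed compressive drain envelope
  fb = 0.15 * drain * np.exp(1j*np.deg2rad(120))  # invented feedback phasor
  v_in = e + fb                          # what the input detector sees
  err_db = 20*np.log10(np.abs(v_in) / e)
  print("envelope error: %.2f dB at low drive, %.2f dB at full drive"
        % (err_db[0], err_db[-1]))       # error changes with level: distortion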
Please don't suggest placing an attenuator between the transceiver and the amp.
There are only a few dB of excess power to burn up in an attenuator, and as a
consequence the attenuator can only improve the situation by a few dB too, which
isn't enough.
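One way to quantify that (the exact factor depends on where the sample is
taken): sampling on the transceiver side of an A-dB pad, the fed-back
contamination crosses the pad once while the wanted drive is A dB stronger
there, so the contamination-to-drive ratio improves by roughly 2*A dB, and A
is limited by the drive headroom:

  import math
  drive_available = 100.0  # W the transceiver can deliver (my assumption)
  drive_needed    = 50.0   # W the amp needs at full output (from the text)
  pad_db = 10 * math.log10(drive_available / drive_needed)  # ~3 dB headroom
  print("max pad: %.1f dB -> contamination down only ~%.1f dB"
        % (pad_db, 2 * pad_db))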
And don't suggest getting the drive envelope from somewhere inside the
transceiver! That would work, but then the amplifier couldn't be simply
connected to the transceiver's antenna output anymore. This would make it
unacceptable for most hams, who wouldn't want to do surgery on their radios to
bring out an internal signal.
At this time my best bet is adding a two-stage buffer/driver at the input of my
amplifier and taking the drive sample from its input. That should solve the
problem, but it adds a significant amount of complexity.
Any ideas are welcome.
And a bonus question: Do you think that the phase distortion in such a
class-A-AB-C-E amplifier will be bad enough to cause poor IMD performance, even
though the envelope linearity is excellent?
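For what it's worth, that effect is AM-PM conversion, and a quick two-tone
experiment gives a feel for the numbers. The sketch below has perfect envelope
linearity and an invented amplitude-dependent phase shift of 20 degrees at the
envelope peak; with those made-up numbers it lands around -21 dBc of
third-order IMD, so phase distortion alone can indeed wreck the IMD. The real
question is how many degrees this class-A-to-E amplifier actually produces:

  import numpy as np
  fs = 100_000
  t = np.arange(0, 0.1, 1/fs)
  x = np.exp(2j*np.pi*700*t) + np.exp(2j*np.pi*1900*t)  # two equal tones
  env = np.abs(x) / np.abs(x).max()
  phase = np.deg2rad(20.0) * env**2     # invented AM-PM: 20 deg at peak
  y = x * np.exp(1j*phase)              # amplitude untouched, phase bent
  Y = np.abs(np.fft.fft(y * np.hanning(len(y))))
  f = np.fft.fftfreq(len(y), 1/fs)
  def dbc(freq):                        # level of the bin nearest freq
      return 20*np.log10(Y[np.argmin(np.abs(f - freq))])
  print("IMD3: %.1f dBc" % (dbc(2*700 - 1900) - dbc(700)))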
Manfred
========================
Visit my hobby homepage!
http://ludens.cl
========================