I ran into an issue while examining interactions among antennas on a planned
tower: tuning unused feedlines to minimize parasitic behavior of idle
antenna elements. This aspect of design seems relevant to optimizing
antenna system performance (and, if disregarded, seems liable to distort the
carefully crafted radiation patterns of certain OWA installations). I'm
interested in finding more information.
Conventional wisdom is that a 40m dipole should be spaced a good distance
from a 15m yagi, because the 40m dipole is resonant on 15m and interferes
with the intended operation of the 15m yagi. The same wisdom suggests that no
such issue arises when a 20m antenna is near a 40m antenna (or near a 10m
antenna).
But I'm reminded that it isn't necessarily so.
A 40m antenna generally is non-resonant on 20m, because its element(s) are a
full wavelength in length. (Ditto a 20m antenna on 10m, and an 80m antenna
on 40m.) But that's with an uninterrupted radiator. With an OWA (or with a
dipole for that matter), the driven element is split. Physically, such a
40m antenna isn't a full wave on 20m; it's two half waves.
How it behaves electrically depends on the feedline that joins the two
element halves. The length of the feedline, and its termination, will
determine whether the two halves of the 40m element behave as effectively
joined into a single element on 20m (i.e., a non-resonant full-wave), or
whether they behave as two 20m half-wave elements. The latter would serve
as unintended parasitic resonant elements on 20m - with detrimental effects
on any 20m antennas nearby. (Ditto a 20m OWA antenna messing with a nearby
10m antenna, and an 80m dipole behaving as two parasitic elements on 40m.)
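To make that transformation concrete, here is a minimal Python sketch of the
standard lossless-line input impedance equation (the 50-ohm Z0, VF = 1, and
the frequencies are purely illustrative, not a claim about any particular
installation):

    import cmath, math

    C_FT_PER_S = 983_571_056.0   # speed of light, in feet per second

    def input_impedance(z_load, z0, length_ft, freq_hz, vf=1.0):
        # Impedance seen looking into a lossless line terminated in z_load.
        # z_load = 0 models a grounded (shorted) feedline;
        # z_load = complex('inf') models an open one.
        beta_l = 2 * math.pi * freq_hz * length_ft / (C_FT_PER_S * vf)
        t = cmath.tan(beta_l)
        if z_load == complex('inf'):
            return z0 / (1j * t)   # Zin = -j * Z0 * cot(beta * l)
        return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

    # One quarter wave on 20m, shorted at the far end: the short is
    # transformed into a (numerically enormous) open at the near end.
    qw_20m = C_FT_PER_S / 14.0e6 / 4   # ~17.56 feet at VF = 1
    print(input_impedance(0, 50.0, qw_20m, 14.0e6))

Every case discussed below falls out of this one function by varying the
line length and the termination.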
The installation I'm considering has a remote antenna switch. The switch
grounds the feedlines of unused antennas. If I'm working on 20m, and the
40m feedline is grounded, the electrical behavior of that split 40m element
on 20m will depend on the length of the 40m feedline. If that feedline
length is an even number of quarter waves (on *20m*), the shorting of the
coax at the switch will appear as a short at the feedpoint of the 40m
antenna - effectively joining the halves of the split element into a
non-resonant full-wave on 20m.
If, however, the feedline length is an odd number of quarter waves on 20m,
the opposite occurs. The short at the switch appears as an open at the
feedpoint. The 40m driven element appears as two disconnected half-wave
elements on 20m. They then serve as parasitic elements for the 20m yagi,
albeit not in the same plane - distorting its radiation pattern.
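Running the sketch above for a grounded 40m feedline, viewed from the 40m
feedpoint at 14 MHz, shows both cases side by side (this reuses
input_impedance() and qw_20m from the earlier sketch):

    for n in range(1, 5):
        zin = input_impedance(0, 50.0, n * qw_20m, 14.0e6)
        print(n, "quarter wave(s) on 20m: |Zin| =", round(abs(zin), 3))
    # Even n -> ~0 ohms (a short at the feedpoint: halves joined into a
    # non-resonant full wave).  Odd n -> astronomically large |Zin| (an
    # open: two disconnected, resonant 20m half waves).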
Commonly, stacked yagis are fed with feedlines that are an odd number of
quarter wavelengths at the operating frequency. Such feedlines - if shorted
when not in use - should avoid this problem. (An odd number of quarter
wavelengths on 40m is an even number on the second harmonic, 20m; ditto for
20m and 10m.)
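The frequency-doubling arithmetic is easy to check with a small helper
(7.1 and 14.2 MHz are just representative spots in the bands, and C_FT_PER_S
comes from the first sketch):

    def quarter_waves(length_ft, freq_hz, vf=1.0):
        # Electrical length of the line, in quarter wavelengths at freq_hz.
        return 4 * length_ft * freq_hz / (C_FT_PER_S * vf)

    line = 3 * C_FT_PER_S / 7.1e6 / 4    # three quarter waves on 40m
    print(quarter_waves(line, 7.1e6))    # 3.0 -- odd:  short -> open
    print(quarter_waves(line, 14.2e6))   # 6.0 -- even: short -> short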
The conventional wisdom about 40/15m antennas seems subject to the same
feedline dependencies. If I'm operating on 15m, the conventional "bad"
interaction occurs if the split element of the 40m antenna is effectively
shorted on 15m, in which case the element appears as three half-waves. But
this effect is avoided if the transmission line to the 40m element is cut to
be an odd number of quarter wavelengths on 15m, and is shorted at the switch
when not in use. A quarter wavelength feedline on 40m is three quarter
wavelengths on 15m. The short at the switch appears as an open at the split
element feedpoint on 15m. This open breaks the 40m dipole element into two
non-resonant halves on 15m, each 3/4 wavelength long (each half is a quarter
wave on 40m, and 15m is the third harmonic). (Of course, if the 40m antenna
is a yagi, then its non-driven elements are not split, and so will always
present a resonance issue near 15m.)
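The 40m/15m numbers fall out the same way (again with quarter_waves() from
above; 7.1 and 21.3 MHz are illustrative):

    line_40m = C_FT_PER_S / 7.1e6 / 4        # one quarter wave on 40m
    print(quarter_waves(line_40m, 21.3e6))   # 3.0 -- odd on 15m: a short at
                                             # the switch appears as an open
                                             # at the split feedpoint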
Of course, in any of the above cases, if the unused feedline is left
open-circuited (e.g., terminated in a patch panel without further
connection), then a feedline that is an odd number of quarter waves long (at
its own antenna's operating frequency) seems to be exactly the worst choice:
on the second harmonic it is an even number of quarter waves, so the open at
the panel reappears as an open at the split feedpoint.
To play with a simple example, I modeled a single-wire 28 MHz dipole (e.g.,
16.8' long) in EZNEC at 70' over real ground, with a source at its midpoint.
The max radiation is at an elevation of 7 degrees, with a null overhead. I
then added a second wire - a 14 MHz dipole (33.6' long) at 60'. I connected
End1 of a 17.56' transmission line (VF=1, lossless) at the midpoint of the
20m antenna, and checked the radiation pattern of the 10m dipole at 28 MHz
with the remote end of the transmission line shorted. The pattern is
essentially unchanged from the 10m dipole alone. Then I open-circuited End2
of the transmission line and checked the pattern again. The max radiation
lobe of the 10m dipole jumps up to 57 degrees, the desired low-angle lobe is
attenuated, and a significant overhead lobe appears. (Fortunately, this
phenomenon seems sharply frequency-dependent: at +/- 250 kHz, the effect is
essentially nil. Also, the lossier the transmission line, the less
pronounced the parasitic effect.)
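As a sanity check on that model (assuming nothing beyond the lossless,
VF = 1 line described above, and reusing the earlier helpers): the 17.56'
line is one quarter wave at 14 MHz, hence a half wave at 28 MHz, so whatever
terminates End2 reappears essentially unchanged at the 20m feedpoint on 10m:

    print(quarter_waves(17.56, 14.0e6))   # ~1.0 quarter wave on 20m
    print(quarter_waves(17.56, 28.0e6))   # ~2.0 -- a half wave on 10m
    # Shorted End2 -> ~0 ohms at the feedpoint (halves joined, benign);
    # open End2 -> tens of kilohms (halves split, resonant on 10m).
    print(abs(input_impedance(0, 50.0, 17.56, 28.0e6)))
    print(abs(input_impedance(complex('inf'), 50.0, 17.56, 28.0e6)))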
Perhaps the best approach - absent a detailed analysis - is simply to avoid
feedlines that are multiples of a quarter wavelength at any frequency of
potential interest. That way, a pure short or a pure open is never
impressed across the feedpoint of an idle antenna - regardless of whether
the feedline is terminated in a short or an open circuit.
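One way to screen candidate lengths against that rule of thumb - purely a
sketch, where the VF of 0.66 (typical of solid-dielectric coax), the band
list, and the tolerance are all assumptions to adjust for the actual
installation:

    BANDS_MHZ = [7.1, 14.2, 21.3, 28.4]   # illustrative band centers

    def risky(length_ft, vf=0.66, tolerance=0.1):
        # Flag bands on which the line sits within `tolerance` quarter waves
        # of an integer multiple -- i.e., where a short or open at the switch
        # is re-presented (or inverted) almost exactly at the feedpoint.
        flagged = []
        for f_mhz in BANDS_MHZ:
            n = quarter_waves(length_ft, f_mhz * 1e6, vf)
            if abs(n - round(n)) < tolerance:
                flagged.append((f_mhz, round(n)))
        return flagged

    print(risky(45.7))   # a hypothetical 45.7' run -- check before cutting

A length that comes back with an empty list never presents a pure short or a
pure open at any listed band, whichever way the switch leaves the far end.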
I haven't seen this issue squarely dealt with in the literature - 'makes me
wonder if I'm imagining these interactions. (I've seen lots about quarter
wave feedlines, but that's often for the purpose of forcing equal currents
into phased radiators from a power splitter.) Others on the list are more
widely read than I - where has this issue of idle feedlines tuning idle
antennas to unwanted resonances (or to desired anti-resonances) been more
fully addressed?
Tnx,
Bill, K2PO/7
_______________________________________________
TowerTalk mailing list
TowerTalk@contesting.com
http://lists.contesting.com/mailman/listinfo/towertalk