I really doubt the FCC would care if you ran your amplifier at a
bit more than 1500 W output to make up for feedline loss, as long as the
power at the antenna didn't exceed the limit. However, they could
require that the power actually be measured at the antenna, which, for
most installations, would be more trouble than it's worth. Realistically,
though, I expect this kind of enforcement is an extremely low priority.
From an ethical standpoint, I certainly wouldn't have any problem with
operators just calculating the expected feedline loss, and adjusting
output power accordingly. (But, no, 100 feet of 9913 does NOT have 10 dB
of loss on 20 meters!)
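The adjustment described above is just dB arithmetic: to make up for a known feedline loss, scale the amplifier output by 10^(loss_dB/10). A minimal sketch (the 0.8 dB/100 ft figure for 9913 at 14 MHz is an illustrative ballpark, not from this post):

```python
def amp_power_for_antenna(p_antenna_watts: float, loss_db: float) -> float:
    """Amplifier output needed so that p_antenna_watts reaches the antenna
    after loss_db of feedline loss."""
    return p_antenna_watts * 10 ** (loss_db / 10)

# Example: roughly 0.8 dB of loss in 100 ft of 9913 on 20 meters
# (illustrative value) means the amp runs a few hundred watts over
# 1500 W to deliver 1500 W at the antenna -- nowhere near the factor
# of 10 that a 10 dB loss would imply.
print(round(amp_power_for_antenna(1500, 0.8)))
```

Note that a true 10 dB loss would require 15,000 W at the amplifier for 1500 W at the antenna, which is why the parenthetical above matters.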
Scott K9MA