I'm looking for collaborators (or even just suggestions on how to
approach it) to explore antenna modeling using cloud resources
(like Amazon Lambda).
Amazon will give you 400,000 Gigabyte-seconds of compute per month for free.
The Lambda cores are faster than the ones I'm running here, but as an
example, a model of crossed dipoles on a structure that I ran recently
had 800 segments, 150 frequency points, and two excitation points, and
produced a single-plane pattern cut of 180 points (every 2 degrees).
The run took about 170 seconds and used well under a gigabyte of memory.
So, within the free allocation, I could run that model a couple thousand
times a month.
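
To make that concrete, here's a rough Python sketch of what a Lambda
worker for this could look like. It assumes you bundle a statically
linked nec2c binary in the deployment package (nec2c and its -i/-o
flags are real; the payload shape and result handling are just
illustrative, not a finished design):

# Minimal sketch of a Lambda worker that runs one NEC job.
# Assumes a statically linked nec2c binary is bundled in the
# deployment package; payload shape and result handling are
# illustrative only.
import os
import subprocess
import tempfile

def handler(event, context):
    # Caller passes the full NEC card deck as a string.
    deck = event["deck"]
    workdir = tempfile.mkdtemp()
    in_path = os.path.join(workdir, "model.nec")
    out_path = os.path.join(workdir, "model.out")
    with open(in_path, "w") as f:
        f.write(deck)
    # Run the bundled engine; /var/task is where the package unpacks.
    subprocess.run(["/var/task/nec2c", "-i", in_path, "-o", out_path],
                   check=True, timeout=840)
    with open(out_path) as f:
        output = f.read()
    # Return the raw output (truncated); a real version would parse the
    # pattern/impedance tables and keep only what the study needs.
    return {"output": output[:200000]}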
This is interesting because we often want to run a systematic
variation on a complex model of multiple antennas. For instance,
earlier this week I was modeling the interaction of two 20m yagis -
those runs are around 1-2 seconds each. If you run every 5 degrees of
skew, that's 72 runs. Start changing the relative heights or
spacings, and you can get into the thousands of runs pretty easily.
Or look at the effect of changing soil properties (using
Sommerfeld-Norton ground) - systematically covering a range of
parameters adds up to a lot of runs pretty quickly.
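
Fanning a sweep like that out could look something like the sketch
below. The "nec-worker" function name is hypothetical and build_deck()
is just a stub for whatever generates the deck; with asynchronous
invocation the worker would drop its results in S3 or a queue rather
than return them:

# Sketch of fanning a parameter sweep out to many concurrent
# Lambda invocations. "nec-worker" is a hypothetical function name,
# and build_deck() is only a stub for whatever writes the NEC deck.
import itertools
import json
import boto3

def build_deck(skew, height):
    # Stub: a real version would emit the GW/GE/FR/EX/RP cards
    # for the two-yagi model at the given skew and height.
    return f"CM skew={skew} height={height}\nCE\n...\nEN\n"

lam = boto3.client("lambda")

skews = range(0, 360, 5)        # every 5 degrees of skew -> 72 angles
heights = (15, 20, 25, 30)      # example relative heights, meters

for skew, height in itertools.product(skews, heights):
    lam.invoke(
        FunctionName="nec-worker",
        InvocationType="Event",  # asynchronous, so the runs proceed in parallel
        Payload=json.dumps({"deck": build_deck(skew, height),
                            "tag": f"skew{skew}_h{height}"}),
    )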
I'm interested in this to speed up the "time to get results" - and for
some applications, I'd be happy to spend a few bucks - so I'm not as
concerned about the "run it for free" aspect.
Amazon charges $0.00001667 per GB-second after you bust the "free" limit.
So, in round numbers, that works out to about 60,000 of those short
(1-second) NEC runs per dollar. And since you can spin up a very large
number of concurrent invocations, you could get your answer back from
a 10,000-iteration study in a few minutes.
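
For what it's worth, here's the arithmetic behind those round numbers,
using the figures above and assuming 1 GB of memory per run:

# Back-of-the-envelope check of the numbers above, assuming 1 GB
# of memory per run.
FREE_GB_SECONDS = 400_000          # monthly Lambda free tier
PRICE_PER_GB_S = 0.00001667        # dollars per GB-second beyond that

def runs_per_dollar(seconds, gigabytes=1.0):
    return 1.0 / (seconds * gigabytes * PRICE_PER_GB_S)

print(runs_per_dollar(1.0))        # ~60,000 one-second yagi runs per dollar
print(runs_per_dollar(170.0))      # ~350 of the 170-second crossed-dipole runs
print(FREE_GB_SECONDS / 170.0)     # ~2,350 free runs of the big model per month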
_______________________________________________
TowerTalk mailing list
TowerTalk@contesting.com
http://lists.contesting.com/mailman/listinfo/towertalk