Many years ago I was experimenting with a two-element 40 meter phased array. It was composed of two quarter-wavelength vertical radiators with moderately good ground systems, spaced a half wavelength apart, in my back yard. The radiators and ground systems were, as close as I could make them, identical. A friend made some measurements of field strength as I excited each element in turn. The unused element was open-circuited at the base, and a directional wattmeter was used to make sure the applied power was constant. I was startled to discover a 10 dB difference between the two! And this was a capably made measurement, not some estimation from an S-meter.
Standing at each antenna and looking toward the friend’s house, the only apparent difference was that a stand of a few Douglas fir trees, approximately a quarter wavelength in height, was in the path from the weaker antenna. They were in my yard, roughly a quarter wavelength from the weak antenna. Receive signal measurements showed at least 6 dB difference between the two antennas, only in the direction through the trees, providing added evidence that the trees were responsible for the attenuation. Not long afterward, I moved the weak element and its ground system a quarter wavelength toward the strong one and repeated the measurements with the friend. The path from the weaker element went through the edge of the stand of trees, but no longer through the center. The measured difference between the elements dropped to 4 dB. The only way I can think of to absolutely prove that trees can have this profound an effect would be to set up a test giving similar results, then cut down the trees and remeasure, but that’s a test I’ve never had the opportunity to do.
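As a side note for anyone repeating this kind of comparison from raw meter readings: dB differences like the 10 dB and 6 dB figures above come from the standard ratio formulas, 10·log10 for power readings and 20·log10 for field-strength (voltage-proportional) readings. A minimal sketch, with illustrative numbers that are not the original measurement data:

```python
import math

def power_db(p1, p2):
    """dB difference between two power readings (e.g. from a wattmeter)."""
    return 10 * math.log10(p1 / p2)

def field_db(e1, e2):
    """dB difference between two field-strength (voltage-proportional) readings."""
    return 20 * math.log10(e1 / e2)

# A 10:1 power ratio is exactly 10 dB:
print(power_db(10.0, 1.0))        # 10.0

# A 10 dB field-strength difference corresponds to a ratio of about 3.16:1:
print(round(field_db(3.16, 1.0), 2))
```

The practical point is that a 10 dB deficit means the weak antenna was delivering roughly a tenth of the power density of the strong one in that direction, despite identical construction and drive power.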
In the meantime, I’m convinced that I have observed 10 dB of attenuation of a vertically polarized HF signal caused by absorption by trees, although I can’t point to exactly what criteria must be met to effect this level of attenuation.
Roy Lewallen, W7EL
(from NEC list)