I am presently running barefoot at 100 watts into a ~100' run of LMR-400 to the antenna. The highest SWR I see is about 2.5, but mostly it hovers around 1.5 or so. According to one of the coax calculators, that should put me at around 89 watts out to the antenna on 20m. That's not too bad. Up in the 2.5 SWR region, it drops me down to about 86 watts out.

Now - I've recently acquired an AL-80B, and would love to make use of the watts I've paid for. I'm going to shoot for 800 watts, give or take, when I get it up and running. At that power, the loss with the same setup starts to add up: at 1.5 SWR I'm down about 85 watts, and at 2.5 SWR I'm down about 109 watts.

I started playing around with the coax calculator and found that 1/2" heliax gets me down to about a 50 watt loss at 2.5 SWR, and 7/8" heliax gets me to about 33 watts of loss. The Antenna Farm sells 7/8" Commscope Heliax at about $2.95 per foot (I bought the 7/8" for my VHF/UHF station awhile back), so running almost 100' would cost me almost $300.

I know folks love to point out that the difference between a 30 watt loss and a 109 watt loss is probably negligible - but considering how much I paid for the amp, why wouldn't I upgrade the cable? I'm just curious how much loss is acceptable in your views?
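For anyone who wants to sanity-check these numbers without a web calculator, here's the standard matched-loss-plus-SWR-loss formula as a short Python sketch. The ~0.45 dB per 100' matched-loss figure for LMR-400 at 14 MHz is my assumption (it's roughly consistent with the wattages quoted above - check the datasheet for your cable), and `swr_loss_watts` is just a name I made up:

```python
import math

def swr_loss_watts(p_in, matched_loss_db, swr):
    """Watts dissipated in the feedline, given the line's matched loss
    (in dB for the full run) and the SWR at the antenna end."""
    gamma = (swr - 1.0) / (swr + 1.0)        # magnitude of reflection coefficient
    a = 10 ** (matched_loss_db / 10.0)       # matched loss as a power ratio
    # Total line loss in dB, including the extra loss from reflected power
    total_db = 10 * math.log10((a**2 - gamma**2) / (a * (1 - gamma**2)))
    p_out = p_in * 10 ** (-total_db / 10.0)
    return p_in - p_out

# Example: 800 W into ~100' of LMR-400 (assumed 0.45 dB matched loss) at 2.5 SWR
# swr_loss_watts(800, 0.45, 2.5) comes out around 108 W lost in the line
```

Swapping in a matched loss of roughly 0.2 dB (1/2" heliax) or 0.13 dB (7/8" heliax) per 100' at 14 MHz - again, verify against the actual datasheet - reproduces the ~50 W and ~33 W heliax figures above.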