Assume someone has a transceiver / 50 Ohm cable / HF antenna setup with a perfect 1:1 SWR, and then changes just the cable from 50 Ohm to 75 Ohm. What would be the change in SWR? I am asking because I want to know how important the impedance match of the cable is and how much that alone would change the SWR. I read somewhere that 75 Ohm cable is more efficient than 50 Ohm cable.

I am also wondering why the "standard" for two-way radios is not something higher than 50 Ohms. My understanding is that 50 Ohms requires higher voltages than 75 Ohms for a given wattage, which seems like a potential safety issue.

Another reason I ask: what if someone got a long piece of 75 Ohm "cable TV" cable for free and wanted to use it to wire up an HF antenna for testing, such as one for 10 meters? How good or bad might the SWR be? Would the length of the cable affect how much the SWR deteriorates? Can an antenna tuner be used to help compensate for the 75 Ohm cable mismatch if the operator doesn't care about maximum output power and just wants a cheap way to test HF antennas?

Finally, what if someone used two equal lengths of 75 Ohm cable wired in parallel and coupled together? Would that be a reasonable match for 50 Ohms? (See the rough calculation below for what I have in mind.)
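To make those last questions concrete, here is the back-of-envelope arithmetic I have in mind. I am assuming a lossless cable and a purely resistive 50 Ohm antenna, and using the simple rule that for a resistive load the SWR is just the ratio of load impedance to cable impedance (larger over smaller). I am not sure that simple rule tells the whole story, which is part of what I am asking.

```python
def swr(z_load_ohms, z_cable_ohms):
    """SWR for a purely resistive load on a lossless line: ratio of larger to smaller impedance."""
    ratio = z_load_ohms / z_cable_ohms
    return max(ratio, 1 / ratio)

# Single 75 Ohm run feeding the (assumed) 50 Ohm resistive antenna
print(swr(50, 75))       # 1.5

# Two equal 75 Ohm cables in parallel would act like a 37.5 Ohm cable
print(swr(50, 75 / 2))   # ~1.33
```

If that arithmetic holds, the parallel arrangement would actually be a slightly better match than a single 75 Ohm run, but I don't know whether the simple formula is valid in practice.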