Which is the most common practice for calibrating meter readouts: Peak, Average, or RMS power? I ask because I have recently built a meter according to G3YNH's Duoma bridge schematic, for inclusion in a home-brew link-coupled FRI-Match antenna tuner that is still under construction.

I have two MFJ antenna tuners: an MFJ-984 purchased new in 1981, and an MFJ-989C bought used at a hamfest. They do not agree with each other. Both work equally well for tuning antennas, but now I want a reference for the meter I'm building. A quandary, therefore: the MFJ-984 shows my IC-745 outputting 140 W, while the MFJ-989C reports 100 W. On the one hand, since the IC-745 is rated for 100 W, I might be inclined to trust the latter. Except that some years ago I sent the IC-745 out to a technician for service. It came back with all new caps, the 60 m band added, and immediately showed higher output on the same MFJ-984 (then my one and only meter). A further quandary.

I have also built a home-brew dummy load, into which I included a diode and taps for measuring power (per a design found online). With a 0.01 uF cap across the power tap terminals and the V^2/R formula, I get different answers entirely, both higher and lower than either MFJ meter, depending on whether I divide V by 1.414 (see the worked sketch at the end of this post). I also find that the diode and 0.01 uF cap on the dummy load now strike me as not such a good way to measure power after all: the detector raises the SWR all by itself, and it gets worse the higher the frequency.

I just want to do this as well as I can without shelling out for a Bird meter. For instruments I have a Fluke DVM and a Hitachi 60 MHz scope. Recommendations?
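
For concreteness, here is the arithmetic I've been doing on the dummy-load reading, as a rough sketch only: it assumes an ideal 50 ohm load, neglects the diode forward drop, and uses an example voltage reading rather than an actual measurement.

```python
# Sketch of the dummy-load power arithmetic.
# Assumptions: ideal 50 ohm load, diode forward drop neglected,
# v_peak is just an example reading, not a measured value.
R = 50.0            # dummy load resistance, ohms (assumed)
v_peak = 100.0      # DC voltage read across the 0.01 uF cap, volts
                    # (the cap holds roughly the peak of the RF envelope)

# Treating the detector reading as if it were already RMS:
p_no_divide = v_peak ** 2 / R        # 200 W -- reads high

# Converting the peak reading to RMS first (divide by 1.414):
v_rms = v_peak / 1.414
p_divided = v_rms ** 2 / R           # ~100 W, i.e. v_peak^2 / (2 * R)

print(p_no_divide, p_divided)
```

The factor-of-two spread between those two numbers is roughly the disagreement I'm seeing, which is part of why I'm asking whether meters are normally calibrated to peak, average, or RMS.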