# HF Digital Error correcting? Also, what's up with PSK31?

Discussion in 'General Technical Questions and Answers' started by N0NS, Oct 9, 2008.

1. Steve,

Why do you say the power in the higher-order sidebands increases as the frequency increases? The power in each sideband component is related to the square of that component's voltage. If each harmonic keeps the same voltage relationship to the fundamental as the fundamental frequency increases (e.g. 1/n for a square wave), then the power in each component should stay the same. The components will be spaced further and further apart as the frequency goes up (i.e. the bandwidth will increase), but the individual power relationship should remain constant.

If you have your receiver sitting at a fixed offset from the carrier, then the power of the harmonics you hear may increase as the keying frequency goes up, since you will be hearing lower and lower order harmonics.

E.g. if you are sitting 500 Hz away with a 50 Hz keying rate, you will hear the 11th harmonic, whose amplitude is related to 1/11. With a 100 Hz keying rate you will hear the 5th harmonic, whose amplitude is related to 1/5 (more than double the amplitude, or more than four times the power). You'll dump more power into the adjacent-channel receiver at the higher keying rate, but that is not because the power in any given harmonic goes up; it is because a lower-order harmonic is being heard.
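Those numbers can be checked with a few lines of code. A minimal sketch, assuming the ideal 1/n amplitude law for a square wave (the function name is my own):

```python
# For an ideal square wave, odd harmonic n has amplitude ~1/n relative
# to the fundamental; power goes as amplitude squared.

def nearest_odd_harmonic(offset_hz, fundamental_hz):
    """Order of the odd harmonic closest to a fixed receiver offset."""
    n = round(offset_hz / fundamental_hz)
    return n + 1 if n % 2 == 0 else n   # square waves have odd harmonics only

for fund in (50, 100):
    n = nearest_odd_harmonic(500, fund)
    print(f"{fund} Hz keying: {n}th harmonic, relative power {(1 / n) ** 2:.5f}")
```

The power ratio (1/5)² / (1/11)² comes out to about 4.8, in line with the "four times the power" figure above.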

I was doing some playing with the math and seem to have come up with this relationship.

If the slope of a sine wave at t=0 is (Aw) -- amplitude times radian freq -- then all of the components of a square wave have the same slope at t=0.

fundamental: slope = Aw
3rd harmonic: slope = (A/3)(3w) = Aw
5th harmonic: slope = (A/5)(5w) = Aw
...
If you treat the slope as you would for a general linear equation, the slopes directly add as you add more components (e.g. [(m1)x + (m2)x + (m3)x] = (m1+m2+m3)x). So the total slope at t=0 is:
(Aw)(number of terms).

I'm not sure what this "means" but it *is* interesting. I suspect it is this relationship that lets us get a pretty good square wave using only 4 or 5 harmonics. Lots of room for bandwidth regulation there!
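That slope relationship is easy to verify numerically. A quick sketch (numeric differentiation of the partial Fourier sum; all names and the chosen values of A, w, and N are mine):

```python
# Each term (A/n)*sin(n*w*t) of the square-wave series has slope A*w at
# t = 0, so the partial sum of N terms should have slope N*A*w there.

import math

A, w, N = 1.0, 2 * math.pi * 100.0, 5    # amplitude, radian freq, number of terms

def partial_sum(t):
    return sum((A / n) * math.sin(n * w * t) for n in range(1, 2 * N, 2))

h = 1e-9                                  # small step for a numeric derivative
slope = (partial_sum(h) - partial_sum(-h)) / (2 * h)
print(slope / (A * w))                    # ~ N
```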

tim ab0wr

2. Tim - it wasn't any rigorous mathematical analysis. It was the simple fact that in both my receiver experiment and my RC filter experiment, when I doubled the keying rate I got exactly the same output "click" (amplitude and shape), but of course at twice the repetition rate. Therefore the sidebands I was observing must have contained twice the power (same energy, twice as often), even though the total power hadn't changed.

Despite many errors in my earlier "rambling", I think there are a couple of points which may still be valid:

1) It seems to me that the definition of bandwidth is at the heart of the conundrum. Take a look at Figure 4 here:
http://fermi.la.asu.edu/w9cf/articles/click/click.html#f2
At first glance, the spectra for the two different keying rates have the same bandwidth. But if Mickey's analysis is correct, these two spectra have different bandwidths using the "ITT/FCC definition", and I suspect that difference comes about because of the increase in amplitude at the fundamental.

2) I also have an instinct that the drop in average power which occurs with "soft keying" when the keying rate increases, even though the peak power remains the same, may account for some of the difference indicated by the definition.

If I go back to my simple RC filter experiments, it would be possible to say that increasing the keying rate does not increase the bandwidth, if we were to re-define bandwidth something like:

"That range of frequencies which excludes those sidebands whose peak-envelope-power represents less than 1% of the total peak-envelope-power of the signal"

And, it seems to me, that definition might be more representative of what we humans perceive in practice. I say that because I suspect we perceive both the main signal and the key clicks in PEP terms rather than average power. For example, if someone sends me a steady series of DITs followed by a steady series of DAHs, I don't immediately sense that "the signal has gone up by 5 dB", although that's what average power analysis would tell us. Similarly, if a series of key clicks doubles in rate, I doubt that I sense it as a 3 dB increase in power .... it's just twice as annoying.

But I'm very conscious that this topic was "done to death" on the eHam thread last year, and coming to it late in the day there's probably very little I can contribute.

And this is probably not the thread for it ...... what was the question asked by the originating poster? 73,
Steve

3. Steve, you are exactly right that the definition of bandwidth is the "culprit" here. As I've tried to point out a number of times, there are many different definitions of bandwidth in the engineering literature. Sklar discusses the "bandwidth dilemma" in his textbook "Digital Communications" (Prentice Hall PTR, 2001, second edition) on page 47. Here are a few quotes from Sklar:

"In summary, for all bandlimited spectra, the waveforms are not realizable, and for all realizable waveforms, the absolute bandwidth is infinite. The mathematical description of a real signal does not permit the signal to be strictly duration limited and strictly bandlimited. Hence, the mathematical models are abstractions; it is no wonder that there is no single universal definition of bandwidth."

Sklar goes on to discuss six different definitions of bandwidth in a similar fashion as Couch does in his textbook (which I previously mentioned in this thread as well as referenced in my eHam article last year). Here are Sklar's comments on the power/occupied bandwidth definition:

"Fractional power containment bandwidth. This bandwidth criterion has been adopted by the Federal Communications Commission (FCC Rules and Regulations Section 2.202) and states that the occupied bandwidth is the band that leaves exactly 0.5% of the signal power above the upper band limit and exactly 0.5% of the signal power below the lower band limit. Thus, 99% of the signal power is inside the occupied band." (Quoted from page 49 in Sklar's textbook)

As I have tried to point out many times, the occupied/power bandwidth defined by the FCC (and ITU) is not equivalent to the so-called "key click" or "interference" bandwidth. The turn-on/turn-off characteristics of the keying envelope do determine the "key click" bandwidth. I also want to point out that the two pulses shown by W9CF at http://fermi.la.asu.edu/w9cf/article.../click.html#f2 do NOT have the same essential bandwidth! The total energy of the wider pulse in Figure 1 is significantly larger than that of the shorter duration pulse; thus, 99% of the total energy of the wider pulse (T = 50 ms) is contained within a narrower bandwidth (Figure 3) than the corresponding 99% bandwidth of the shorter duration pulse (T = 20 ms) shown in Figure 2. This result is entirely consistent with fundamental Fourier transform theory (i.e., the "bandwidth" is inversely proportional to pulse duration). In Figure 4 W9CF is, in effect, demonstrating on an "absolute" energy basis that the energy densities (energy per unit bandwidth, in joules per Hz) of the two pulses are the same at frequencies far removed from the center frequency of 0 Hz.
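The inverse relationship between pulse duration and 99% energy bandwidth can be illustrated numerically. A sketch, assuming an idealized rectangular envelope (energy spectral density T²·sinc²(fT)) rather than W9CF's exact pulse shapes; the function and step size are my own choices:

```python
# 99% energy-containment bandwidth of a unit-amplitude rectangular pulse
# of duration T. By Parseval, the pulse's total energy is T.

import math

def containment_bandwidth(T, fraction=0.99, df=0.1):
    """Two-sided bandwidth (Hz) containing `fraction` of the pulse energy."""
    target = fraction * T
    acc, f = 0.0, 0.0
    while acc < target:
        s = math.sin(math.pi * f * T) / (math.pi * f * T) if f else 1.0
        density = (T * s) ** 2          # energy per Hz at +/- f
        acc += (2 if f else 1) * density * df
        f += df
    return 2 * f

for T in (0.020, 0.050):
    print(f"T = {T * 1000:.0f} ms: 99% bandwidth ~ {containment_bandwidth(T):.0f} Hz")
```

The longer (50 ms) pulse concentrates 99% of its energy in a band about 2.5 times narrower than the 20 ms pulse, consistent with bandwidth scaling as 1/T.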

Although the "key click" bandwidth is important, it is the occupied/power bandwidth as defined by the FCC/ITU that truly is the more fundamental concept in a communications system, whether we are assuming a "modern" digital system or a CW/Morse code system employing human operators. (Just ask the EME/QRSS folks about "bandwidth" and you will see!)

73, K5MC

Last edited: Oct 30, 2008
4. Yes, I guess a large part of the "problem" is caused by having the debate around a CW signal - most CW signals are very "sub-optimal" in bandwidth terms. I guess if you looked at the "real" underlying information rate (in Shannon's terms) of most CW transmissions and compared the signal to the Shannon/Hartley limit it would look pretty inefficient.

As you probably gathered, I spent a number of years researching optimum signal processing strategies for military satellite comms systems. At the time, PSK with moderate-length convolutional codes and Viterbi decoding was the best we could do. I had a pure-mathematics graduate working for me who did some pretty fancy work on convolutional codes, including finding a solution to a basic maths problem that had previously been considered insoluble.

Until then I was a bit sceptical about the contribution a pure mathematician could make in my engineering world, but he changed my mind. He went on to develop optimum strategies for code-locking local sequence generators to incoming CDMA signals buried in noise.

It would be very interesting to see how close the QRSS folk get to the Shannon/Hartley limit ..... but we have to remind ourselves we're not always (often) working in a Gaussian noise channel. 73,
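For anyone who wants to play with the numbers, the Shannon/Hartley limit itself is one line. A sketch (the 500 Hz bandwidth and 10 dB SNR figures are illustrative assumptions, not measurements):

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon/Hartley limit C = B * log2(1 + S/N) for a Gaussian channel."""
    return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

# e.g. a 500 Hz CW channel at 10 dB SNR -- far above any hand-sent Morse rate
print(f"{shannon_capacity_bps(500, 10):.0f} bits/s")
```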
Steve

5. Ok. Power is the rate at which energy is expended. That wouldn't change. The amount of energy expended would change because of the repetition rate. (power: watts = joules/sec; energy: watts x sec = joules)

Be very careful about using w9cf's analysis.

When he splits the keying window into three areas he takes the "rise time" part of the waveform and makes it into its own short pulse with a fall time of zero. It is this windowing technique that must be used judiciously. I don't know if you remember me talking about taking a sine wave with a period of .001 sec, splitting it into ten windows of .0001 sec, and then analyzing the sine wave as if it were made up of a square wave with a period of .0001 sec. You must be very careful to get rid of any artifacts generated by the windowing.

In addition, his analysis actually measures POWER, not energy density. He merely squares the amplitude of the sidebands. That is NOT energy density, at least not the way I learned it. Density has to have a "per unit factor" to be density. That requires dividing by something.

Look at his Fig 3, for example. He says he is plotting energy density in dB referenced to the carrier energy density, yet at 0 Hz he shows an energy density of +10 dB!

If he were measuring energy "expended" at any frequency (i.e. integrating over a pulse width) I could understand this. But exactly what does the energy "expended" tell us? To me it is rather meaningless.

I'm not sure what you mean by "average power". Do you mean the total energy expended? Power level is not bandwidth. Power level change (i.e. slew rate) *can*, however, mean distortion products in an amplifier not capable of handling that slew rate.

Is your filter low-pass or high-pass? I forget.

If it is a low-pass, and if your keying rate is slow enough that the perceptible sidebands are well below the cutoff frequency, then the bandwidth will change with keying speed.

Just don't forget the sideband spacings. The 1% PEP sideband point will be further away from the carrier for a faster keying rate unless you artificially limit the total bandwidth. Limit the bandwidth enough and you may not find *any* sidebands less than 1% of the total.

Average power analysis won't tell you that the signal has gotten any stronger for a longer pulse. An energy expended analysis will tell you that you have expended more energy but I'm not sure that tells you anything -- except like you say it's more aggravating.

Power is a rate. Power multiplied by time is Energy.

Like I said, most of us have to walk before we run. Understanding the basics is what leads to understanding the more complex. Understanding how PSK31 is *so* spectrum-efficient requires understanding how keying and bandwidth work.

There are always new entrants into the discussion. You've already given some new perspectives. Don't worry about "coming to it late".

tim ab0wr

6. Tim,

I haven't the time to go through every one of your points, but let me just describe my RC filter experiment in more detail - it may shed some light. I have a function generator producing a 0/5 V square wave at 10 Hz, and I apply it to an RC high-pass filter. I look at the output of the filter on an oscilloscope and I see exactly what you would expect: alternate positive-going and negative-going short-duration pulses, every 50 ms; the pulses have an abrupt leading edge which reaches 5 V (or -5 V), followed by an exponential decay determined by R and C. Easy so far. R = 3k3 and C = 100 nF, so the 3 dB cut-off frequency is around 480 Hz. Although I've described the experiment in time-domain terms, we can also consider that my filter is "showing me" the high-order sidebands above 480 Hz. [I chose 480 Hz to mimic what I might see in an adjacent receiver CW channel].

Now I double the frequency of the generator to 20 Hz. The average input power hasn't changed because it's still a 5 V signal with a 50:50 ratio. What I see at the output of the filter is exactly the same pulses, but now spaced by 25 ms.

Unless I'm missing the blindingly obvious - always a possibility - that filter output is twice the average power that it was [same energy per pulse, but twice as many pulses per unit time]. Just to be sure, I put a very sensitive uA meter across the filter output, fed through a high value series resistor (so as not to load it) and a diode (because it's a DC meter and the filter output swings negative as well as positive). I assume (maybe wrongly) that the deflection of the meter is a measure of the average power output from the filter. When I double the generator frequency I see double the deflection on the meter.

So, without needing to get stuck into the Fourier analysis, this experiment tells me that, when I double the keying rate, I double the average power in the sidebands above 480 Hz. So, by any definition of bandwidth which compares the average power in the sidebands with the average total power, the bandwidth must have changed.
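The bench experiment can also be mimicked in software. A sketch, assuming a simple first-order discrete-time model of the R = 3k3 / C = 100 nF high-pass (the sample rate and all names are my choices):

```python
# 0/5 V square wave into a first-order RC high-pass (fc ~ 480 Hz), then
# the mean-square output voltage as a stand-in for average output power.

R, C = 3300.0, 100e-9
FS = 100_000                        # sample rate, Hz
DT = 1.0 / FS
ALPHA = (R * C) / (R * C + DT)      # standard discrete RC high-pass coefficient

def avg_output_power(f_square, seconds=1.0):
    n = int(FS * seconds)
    x_prev = y_prev = 0.0
    acc = 0.0
    for i in range(n):
        x = 5.0 if (i * f_square / FS) % 1.0 < 0.5 else 0.0
        y = ALPHA * (y_prev + x - x_prev)   # y[i] = a*(y[i-1] + x[i] - x[i-1])
        acc += y * y
        x_prev, y_prev = x, y
    return acc / n                  # mean-square volts ~ power into a fixed load

p10 = avg_output_power(10.0)
p20 = avg_output_power(20.0)
print(f"ratio of average output power, 20 Hz vs 10 Hz: {p20 / p10:.2f}")
```

The ratio comes out very close to 2: identical pulse energy at twice the repetition rate, which agrees with the meter deflection doubling.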

Now where did I go wrong? Steve

7. Tim,

Perhaps I should have said what I mean by "average power":

total energy integrated over a time period that is long compared to the signal period, divided by the integration period.

Steve

8. I'm not sure what your meter is indicating. Depending on the action of the meter, it could be almost anything.

Energy = joules. Watts = joules/sec.

If you multiply watts by seconds you get energy in joules.

If the power in watts is constant (i.e. not a function of time) at a specific frequency in the sidebands, then integrating the power at that frequency over the duration of the waveform doesn't give you average watts; it gives you energy expended.

The integral of P dt from 0 to T is (P)(T) - (P)(0) = (P)(T)

Where T is the period you are measuring.

If you sum all of the powers associated with each frequency in the sideband you get total Power in the signal.

---------------------

If the signal from a higher keying rate is stronger than one from a slower keying rate, it is because you have more sidebands from the faster keying rate getting "past" your high-pass filter. Consider a very slow keying rate where *all* of the sidebands are below 480 Hz. You actually wouldn't hear anything.

At 10 Hz you would only be getting the sidebands from the 49th harmonic on up. Your signal would start at 1/49th the amplitude of the fundamental and go down from there.

At 20 Hz you would be getting the 25th harmonic on up. The signal would start at 1/25th the amplitude of the fundamental and go down from there.

Your slower signal would be missing all the power contained in its 25th through 47th harmonics, since those fall below the cutoff and get filtered out.

Total power in the faster signal would be higher than total power in the slower signal on the filtered side of the filter.
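This bookkeeping can be totalled up directly. A sketch, again assuming ideal 1/n square-wave harmonic amplitudes and an ideal brick-wall 480 Hz high-pass (the function name and truncation limit are mine):

```python
# Sum the power (amplitude^2) of all odd harmonics of an ideal square
# wave that land above a 480 Hz cutoff, for two keying fundamentals.

def power_above_cutoff(fund_hz, cutoff_hz=480.0, n_max=200_001):
    n0 = int(cutoff_hz // fund_hz) + 1      # first harmonic above the cutoff
    if n0 % 2 == 0:
        n0 += 1                             # square waves: odd harmonics only
    return sum((1.0 / n) ** 2 for n in range(n0, n_max, 2))

p10 = power_above_cutoff(10.0)   # starts at the 49th harmonic
p20 = power_above_cutoff(20.0)   # starts at the 25th harmonic
print(f"power ratio, 20 Hz vs 10 Hz keying: {p20 / p10:.2f}")
```

The faster keying rate puts roughly twice the power past the filter, which lines up nicely with the factor of two Steve measured at the filter output.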

This discussion is kind of rambling. I hope it makes sense.

tim ab0wr

9. Tim, we need to be careful about our terms. If by "keying rate" you mean the frequency of the square wave, your statement is flawed. If I drop the frequency to 0.1 Hz I can assure you I still see plenty of sidebands above 480 Hz, because I see exactly the same filter output pulse occurring at 5-second intervals. I think you're forgetting the high-frequency components which are generated by the short risetime.

Exactly - and that means the bandwidth is higher for the faster signal if we define bandwidth in terms of the average power in the sidebands compared to the average total power.

Steve

10. I LOVED Amtor when it was popular....alas, there's nobody to talk to any more. If anyone's interested in resurrecting some Amtor activity, I'm game.

eric