I'm a little embarrassed that I don't really grasp what the dB column in WSJT-X is telling me. I know that a decibel expresses a ratio of two signal levels; that is, a decibel is a measurement relative to a reference. My understanding is that the dB column is calculated from the ratio of signal to noise. This is where I start getting confused. Am I right to say that a value of -20 means the decoded signal is 20 dB below the average level of noise I am receiving at that moment? (That is, the decoded signal is 20 dB lower than the reference level, which is the noise.) Relatedly, how does the software know what the noise is? How does it distinguish between signal and noise, especially when they are very close in level? I don't have an electronics background, let alone a signal processing background, so I am trying to understand this only in very general terms.
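To make my mental model concrete, here is the arithmetic I *think* is behind that column, written as a tiny Python sketch. The function name and inputs are just mine for illustration; I'm not claiming this is how WSJT-X actually estimates the noise, only showing the textbook decibel formula for a power ratio:

```python
import math

def snr_db(signal_power, noise_power):
    """Decibels for a power ratio: dB = 10 * log10(signal / noise).

    Hypothetical helper for illustration only -- WSJT-X's real noise
    estimate is exactly what my question is about.
    """
    return 10 * math.log10(signal_power / noise_power)

# A signal carrying 1/100th the power of the noise would read as -20 dB,
# which is how I'm interpreting a "-20" in the dB column:
print(snr_db(1.0, 100.0))
```

If that's right, then every 10 dB step is a factor of 10 in power, and the negative sign just means the signal is weaker than the reference (the noise).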