(I posted this same query at the rtl-sdr.com forum, let's see which forum "wins" the game!)

This is a fairly deep technical question that applies to all SDRs. It's purely my curiosity; I'm posting on the chance there is an SDR expert out there who can answer it in a simple manner, or post a hyperlink...

I have always wondered how the raw ADC samples in the RTL-SDR dongle (28.8 megasamples/second?) are processed and downsampled to the lower rates needed to transfer data out of the dongle to the PC software for a given bandwidth setting. In other words, a 2 MHz bandwidth setting requires a lot more bits/second over the USB connection than a 1 MHz setting, but how exactly is the original raw data from the ADC transformed inside the dongle? Is an FFT done, followed by filtering at the selected bandwidth, then an inverse transform? Or is some kind of mathematical averaging done on the entire stream to downsample it?
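To make the second idea concrete, here's the kind of "averaging" I imagine might be going on — low-pass filter the stream, then throw away samples (decimation). This is just a rough NumPy sketch of the concept on my part, not a claim about what the chip actually does; the filter design and the decimation factor of 12 are my own illustrative choices:

```python
import numpy as np

def decimate(samples, factor, num_taps=64):
    """Low-pass filter, then keep every `factor`-th sample.

    Illustrative only -- real hardware would use dedicated DSP blocks,
    not a floating-point convolution like this.
    """
    # Windowed-sinc FIR low-pass, cutoff at the *new* Nyquist frequency,
    # so frequencies above it can't alias into the reduced-rate stream.
    n = np.arange(num_taps) - (num_taps - 1) / 2
    h = np.sinc(n / factor) * np.hamming(num_taps)
    h /= h.sum()                         # normalize for unity DC gain
    filtered = np.convolve(samples, h, mode="same")
    return filtered[::factor]            # discard intermediate samples

# Toy example: a 28.8 MHz-rate stream decimated by 12 -> 2.4 MHz rate
rate_in, factor = 28.8e6, 12
t = np.arange(4096) / rate_in
signal = np.cos(2 * np.pi * 100e3 * t)   # 100 kHz tone, well inside passband
out = decimate(signal, factor)
print(len(signal), "->", len(out))       # 4096 -> 342
```

The point being that the low-pass filter is what makes the sample-dropping safe: anything above the new Nyquist rate would otherwise alias back into the narrower band.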