Last time we were left with a question: if RSSI is a measure of signal strength, and if link quality is a measure of the signal-to-noise ratio, and if Windows provides a standard formula for converting link quality to dB, why does the Amped Wireless adapter report an RSSI value (dBm) that is lower than the signal-to-noise ratio? We can’t have negative noise, can we? (It seems some manufacturers might think we can.)
You may recall that this investigation started when a customer reported that an app I’d written was showing signal strengths about 10 dB lower than those reported by InSSIDer Office. I eventually realized that it appears to report an RSSI value calculated from the link quality reading supplied by Windows, whereas I was reporting the actual RSSI value. This leaves us with a very important question: which value matches the actual received signal strength?
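As a reminder, the conversion in question is the linear interpolation Windows documents for the wlanSignalQuality (link quality) value: 0 corresponds to -100 dBm and 100 corresponds to -50 dBm. A minimal sketch of that mapping:

```c
/* Linear interpolation documented for wlanSignalQuality (0..100):
 * 0 maps to -100 dBm, 100 maps to -50 dBm. */
static int QualityToDbm(unsigned int quality)
{
    if (quality > 100)
        quality = 100;
    return -100 + (int)(quality / 2);
}
```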
It turns out that measuring signal strengths with most wifi equipment can be difficult. The wifi spectrum is very crowded, making it hard to directly measure signals over the air. Wifi is bursty, and your signal is interleaved with all of your neighbors’ transmissions on the same channel. How can you be sure you’re really measuring the right thing?
The obvious answer is to measure your signal on the bench, cabling your equipment together to keep out unwanted interference. Unfortunately this doesn’t always work very well in practice because most wifi radios (and especially this Amped Wireless USB adapter) are not well shielded and signals leak in like you wouldn’t believe. The last time I did this I had to put the receiver in a shielded box about 50 feet down the hall and connect it to the transmitter with a high-quality cable just so it wouldn’t pick up the stray signal radiated out of the case rather than the connector.
I happen to live in a forested rural area with very few neighbors, and this really simplified things because the only signal I see on wifi channel 1 is from my home access point. I first recorded the signal strength as reported by my app and by InSSIDer Office (-58 dBm vs. -50 dBm), and then I measured it directly with a Tektronix RSA306 Real-Time Spectrum Analyzer. I was careful to mount the antenna and cable in place so they wouldn’t move when switching between the Amped Wireless radio and the spectrum analyzer. (A slight change in antenna position can make a big difference in received signal strength, as can variations in the environment.) After repeating the measurement several times I determined that the actual received signal strength at the antenna was -57 dBm. After repeating this whole experiment with a few different signal strengths, I convinced myself that the RSSI value reported by the Amped Wireless adapter is a better measure of signal strength than the calculation from link quality.
So why do some applications use link quality instead?
It turns out there’s another Windows Native Wifi call that returns signal strength for the currently connected network: WlanQueryInterface(). One of the options you can pass to this routine is wlan_intf_opcode_rssi, which returns an RSSI value for the specified interface. Unfortunately, different wifi adapters return different values for this call. I’ve seen some adapters/drivers that return dBm, some that return 0–100, and some that return other ranges entirely. This variation makes the RSSI value difficult to use.
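If you haven’t used this call before, here’s a minimal sketch (error handling trimmed) that opens a WLAN handle, walks the interfaces, and prints the raw value returned for wlan_intf_opcode_rssi. Keep in mind that, as noted above, what that raw number means depends on the adapter and driver:

```c
#include <windows.h>
#include <wlanapi.h>
#include <stdio.h>

#pragma comment(lib, "wlanapi.lib")

int main(void)
{
    HANDLE hClient = NULL;
    DWORD version = 0;
    if (WlanOpenHandle(2, NULL, &version, &hClient) != ERROR_SUCCESS)
        return 1;

    PWLAN_INTERFACE_INFO_LIST pIfList = NULL;
    if (WlanEnumInterfaces(hClient, NULL, &pIfList) != ERROR_SUCCESS) {
        WlanCloseHandle(hClient, NULL);
        return 1;
    }

    for (DWORD i = 0; i < pIfList->dwNumberOfItems; i++) {
        PLONG pRssi = NULL;
        DWORD size = 0;
        /* Query the driver-reported RSSI for this interface.  The units
         * and range of the returned LONG vary by adapter/driver. */
        if (WlanQueryInterface(hClient,
                               &pIfList->InterfaceInfo[i].InterfaceGuid,
                               wlan_intf_opcode_rssi,
                               NULL, &size, (PVOID *)&pRssi, NULL)
                == ERROR_SUCCESS) {
            printf("Interface %lu raw RSSI: %ld\n", i, *pRssi);
            WlanFreeMemory(pRssi);
        }
    }

    WlanFreeMemory(pIfList);
    WlanCloseHandle(hClient, NULL);
    return 0;
}
```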
In the course of this investigation I’ve learned that the Intel adapter in the laptop I’m using to write this post returns values from 0 to about 100, where the conversion to dBm appears to be approximately -95 + rssi. The Realtek driver for the Amped Wireless card on Windows 10 returns values from 0 to 255, and the conversion to dBm seems to treat this value as a signed byte (-128 to 127) and add -50. I seem to recall some older Cisco radios returned 0–100 but required a table lookup to convert to dBm. In my experience none of these conversions are documented, other than the occasional obscure research paper or mapping table that pops up in deep, dark corners of the internet. With all this variation it’s no wonder that developers would rather use link quality to approximate signal strength, which all radios seem to report with reasonable consistency. That way their application works reasonably well with all wifi adapters.
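To make that concrete, here’s roughly what the two mappings I observed look like as code. The helper names are hypothetical, and they simply encode the empirical behavior described above; none of this is documented by the vendors:

```c
/* Hypothetical helpers encoding the empirical mappings described above;
 * neither conversion is documented, so treat these as guesses. */

/* Intel adapter: raw values observed from 0 to ~100, roughly -95 + raw dBm. */
static int IntelRawToDbm(long raw)
{
    return -95 + (int)raw;
}

/* Realtek (Amped Wireless) driver on Windows 10: raw values 0..255 that
 * appear to be a signed byte in disguise, offset by -50. */
static int RealtekRawToDbm(long raw)
{
    signed char s = (signed char)(raw & 0xFF);  /* reinterpret as signed byte */
    return (int)s - 50;
}
```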
Now back to the original question: why is the signal strength reported by RSSI 10 dB lower than the one calculated from the link quality, which we’ve assumed to be a measure of signal-to-noise ratio?
My current theory is that manufacturers are playing a bit of a game with this value. Let’s say someone comes out with a new wifi adapter that has a low-noise amplifier (LNA) in the front end. This will improve the receiver sensitivity, allowing it to receive weaker signals than other adapters without an LNA. So they would naturally want to differentiate their product by reporting a higher link quality number. After all, if a -90 dBm signal can barely be decoded by a normal card, but their adapter can receive signals all the way down to -100 dBm, their reported link quality should be 10 dB better, right? We can’t really fault them for this reasoning, because the quality of their link really would be better than the other card’s. The problem really stems from Microsoft claiming a conversion between link quality and dBm, and that’s not the manufacturer’s fault, eh?