More WiFi Madness

Last time we were left with a question:  if RSSI is a measure of signal strength, and if link quality is a measure of the signal-to-noise ratio, and if Windows provides a standard formula for converting link quality to dB, why does the Amped Wireless adapter report an RSSI value (dBm) that is lower than the signal-to-noise ratio?  We can’t have negative noise, can we?  (It seems some manufacturers might think we can.)

You may recall that this investigation started when a customer reported that an app I’d written was showing signal strengths about 10 dB lower than those reported by inSSIDer Office.  I eventually realized that inSSIDer seems to be reporting an RSSI value calculated from the link quality reading reported by Windows, whereas I was reporting the actual RSSI value.  That leaves us with a very important question:  which value matches the actual received signal strength?

It turns out that measuring signal strengths with most wifi equipment can be difficult.  The wifi spectrum is very crowded, which makes it hard to measure signals directly over the air.  Wifi is bursty, and your signal is interleaved with all of your neighbors transmitting on the same channel.  How can you be sure you’re really measuring the right thing?

The obvious answer is to measure your signal on the bench, cabling your equipment together to keep out unwanted interference.  Unfortunately this doesn’t always work very well in practice because most wifi radios (and especially this Amped Wireless USB adapter) are not well shielded and signals leak in like you wouldn’t believe.  The last time I did this I had to put the receiver in a shielded box about 50 feet down the hall and connect it to the transmitter with a high-quality cable just so it wouldn’t pick up the stray signal radiated out of the case rather than the connector.

I happen to live in a forested rural area with very few neighbors, and this really simplified things because the only signal I see on wifi channel 1 is from my home access point.  I first recorded the signal strength as reported by my app and by inSSIDer Office (-58 dBm vs. -50 dBm), and then I measured it directly with a Tektronix RSA306 Real-Time Spectrum Analyzer.  I was careful to mount the antenna and cable in place so they wouldn’t move when switching between the Amped Wireless radio and the spectrum analyzer.  (A slight change can make a big difference in received signal strength, as can variations in the environment.)  After repeating the measurement several times I determined that the actual received signal strength at the antenna was -57 dBm.  Repeating the whole experiment at a few different signal levels convinced me that the RSSI value reported by the Amped Wireless adapter is a better measure of signal strength than the value calculated from link quality.

So why do some applications use link quality instead?

It turns out there’s another Windows Native Wifi call that returns signal strength for the currently connected network: WlanQueryInterface().  One of the options you can pass to this routine is wlan_intf_opcode_rssi, which returns an RSSI value for the specified interface.  Unfortunately, different wifi adapters return different values for this call.  I’ve seen some adapters/drivers that return dBm, some that return 0 – 100, and some that return other ranges entirely.  This variation makes the RSSI value from this call difficult to use.
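To make this concrete, here’s a minimal sketch (not code from my app) that queries the per-interface RSSI.  It opens a Native Wifi handle, enumerates the wireless interfaces, and asks each one for wlan_intf_opcode_rssi; the data appears to come back as a LONG, but as noted above its range and units depend entirely on the driver.

    /* Minimal sketch: query the connected interface's RSSI via WlanQueryInterface().
       Link with wlanapi.lib.  Error handling trimmed for brevity. */
    #include <windows.h>
    #include <wlanapi.h>
    #include <stdio.h>

    #pragma comment(lib, "wlanapi.lib")

    int main(void)
    {
        HANDLE hClient = NULL;
        DWORD ver = 0;
        if (WlanOpenHandle(2, NULL, &ver, &hClient) != ERROR_SUCCESS)
            return 1;

        PWLAN_INTERFACE_INFO_LIST ifList = NULL;
        if (WlanEnumInterfaces(hClient, NULL, &ifList) != ERROR_SUCCESS)
            return 1;

        for (DWORD i = 0; i < ifList->dwNumberOfItems; i++) {
            PVOID data = NULL;
            DWORD size = 0;
            /* This query only returns data when the interface is connected. */
            if (WlanQueryInterface(hClient, &ifList->InterfaceInfo[i].InterfaceGuid,
                                   wlan_intf_opcode_rssi, NULL, &size, &data,
                                   NULL) == ERROR_SUCCESS) {
                printf("interface %lu: driver-reported RSSI = %ld (units vary by driver)\n",
                       i, *(LONG *)data);
                WlanFreeMemory(data);
            }
        }
        WlanFreeMemory(ifList);
        WlanCloseHandle(hClient, NULL);
        return 0;
    }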

In the course of this investigation I’ve learned that the Intel adapter in the laptop I’m using to write this post returns values from 0 to about 100, where the conversion to dBm appears to be approximately -95 + rssi.  The Realtek driver for the Amped Wireless card on Windows 10 returns values from 0 to 255, and the conversion to dBm seems to treat this value as a signed byte (-128 to 127) and add -50.  I seem to recall that some older Cisco radios returned 0 – 100 but required a table lookup to convert to dBm.  In my experience none of these conversions are documented, other than the occasional obscure research paper or mapping table that pops up in deep, dark corners of the internet.  With all this variation it’s no wonder that developers would rather use link quality to approximate signal strength, since all radios seem to report it with reasonable consistency.  That way their application works reasonably well with all wifi adapters.
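For what it’s worth, here’s what those two observed conversions would look like in code.  These are just my guesses based on the measurements above; they describe undocumented, driver-specific behavior, not any general rule.

    #include <stdio.h>

    /* Intel driver (observed): raw value roughly 0..100, dBm ~ -95 + raw */
    static int intel_raw_to_dbm(long raw)
    {
        return (int)(-95 + raw);
    }

    /* Realtek driver for the Amped Wireless adapter on Windows 10 (observed):
       raw value 0..255, reinterpreted as a signed byte and offset by -50 */
    static int realtek_raw_to_dbm(long raw)
    {
        signed char sb = (signed char)(raw & 0xff);   /* 0..255 -> -128..127 */
        return -50 + sb;
    }

    int main(void)
    {
        printf("Intel raw 45    -> %d dBm\n", intel_raw_to_dbm(45));     /* -50 dBm */
        printf("Realtek raw 206 -> %d dBm\n", realtek_raw_to_dbm(206));  /* -100 dBm */
        return 0;
    }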

Now back to the original question:  why is the signal strength reported by RSSI 10 dB lower than that calculated from the link quality, which we’ve assumed to be a measure of signal-to-noise?

My current theory is that manufacturers are playing a bit of a game with this value.  Let’s say someone comes out with a new wifi adapter that has a low-noise amplifier (LNA) in the front end.  This will improve the receiver sensitivity, allowing it to receive weaker signals than other adapters without an LNA.  So they would naturally want to differentiate their product by reporting a higher link quality number.  After all, if a -90 dBm signal can barely be decoded by a normal card, but their adapter can receive signals all the way down to -100 dBm, their reported link quality should be 10 dB better, right?  We can’t really fault them for this reasoning, because the quality of their link really would be better than that of the other card.  The problem really stems from Microsoft claiming a conversion between link quality and dBm, and that’s not the manufacturer’s fault, eh?


Windows WiFi Madness

Years ago I’d written a wifi scanning app that logs RSSI data from nearby wireless access points.  This app retrieved signal strengths from Windows using what was then called the Wlan API, since renamed Windows Native Wifi.  The WlanGetNetworkBssList() function returns a list of WLAN_BSS_ENTRY structures, each containing information about a detected WiFi BSS (think access point).  This structure has an lRssi field documented as returning “the received signal strength indicator (RSSI) value, in units of decibels referenced to 1.0 milliwatts (dBm).”  How convenient: dBm is exactly the reference everyone in the wireless industry uses for signal strength.  I logged the lRssi field for all detected access points and called it a day.
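For reference, here’s a stripped-down sketch of that kind of query (not the original app); it grabs the BSS list for each wireless interface and prints the lRssi field along with the uLinkQuality field discussed below.

    /* Minimal sketch: list every BSS Windows has seen and print its RSSI and
       link quality.  Link with wlanapi.lib; error handling trimmed for brevity. */
    #include <windows.h>
    #include <wlanapi.h>
    #include <stdio.h>

    #pragma comment(lib, "wlanapi.lib")

    int main(void)
    {
        HANDLE hClient = NULL;
        DWORD ver = 0;
        if (WlanOpenHandle(2, NULL, &ver, &hClient) != ERROR_SUCCESS)
            return 1;

        PWLAN_INTERFACE_INFO_LIST ifList = NULL;
        if (WlanEnumInterfaces(hClient, NULL, &ifList) != ERROR_SUCCESS)
            return 1;

        for (DWORD i = 0; i < ifList->dwNumberOfItems; i++) {
            PWLAN_BSS_LIST bssList = NULL;
            /* NULL SSID + dot11_BSS_type_any = every BSS from the most recent scan */
            if (WlanGetNetworkBssList(hClient, &ifList->InterfaceInfo[i].InterfaceGuid,
                                      NULL, dot11_BSS_type_any, FALSE, NULL,
                                      &bssList) != ERROR_SUCCESS)
                continue;

            for (DWORD j = 0; j < bssList->dwNumberOfItems; j++) {
                WLAN_BSS_ENTRY *bss = &bssList->wlanBssEntries[j];
                printf("%-32.*s  lRssi=%ld dBm  uLinkQuality=%lu\n",
                       (int)bss->dot11Ssid.uSSIDLength, bss->dot11Ssid.ucSSID,
                       bss->lRssi, bss->uLinkQuality);
            }
            WlanFreeMemory(bssList);
        }
        WlanFreeMemory(ifList);
        WlanCloseHandle(hClient, NULL);
        return 0;
    }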

But a few days ago I received an email noting that the values logged by my app don’t match those from a popular wifi troubleshooting app called inSSIDer Office, and asking me to investigate.  I downloaded a trial copy and found that, indeed, my app was showing signal strengths that were generally about 10 dB lower than those reported by inSSIDer Office.  What gives?  I’m using the official Windows API, so what could be better?  (Don’t laugh.)  By the way, while wifi scanners are now a dime a dozen, inSSIDer Office seems like a pretty cool tool.  I didn’t play with all its features, but it did seem to show exactly the information you need to understand your local wifi environment.

Now to the investigation...

I checked the google to see if Windows had changed their wifi API and now had a different or better way of getting RSSI.  No luck: other than the rename to Native Wifi it looked the same, and the WlanGetNetworkBssList() function was still the way to go.  (After all, Windows would never replace an API, would they?)

Next up was a closer look at the data.  I wrote a quickie test app to display the relevant data from the WLAN_BSS_ENTRY list and ran it alongside inSSIDer Office to get a feel for how the data differed.  The first thing I noticed was that both apps reported the same signal strength for very strong signals (greater than about -50 dBm).  Between -50 and -65 dBm, readings were about 5 dB lower in my test app, and below that they were about 10 dB lower.  Very interesting, and what affects weak signals more than strong ones?  Noise.

The WLAN_BSS_ENTRY structure has another field called uLinkQuality that returns a value from 0 to 100 and indicates the link quality.  I understand this to be a rough measurement of the signal-to-noise ratio (SNR), or the extent to which the received signal exceeds the noise floor.  In practice, wifi radios can’t easily distinguish noise from interference, so the link quality field actually reports the ratio of the signal to noise plus interference.  I seem to recall that the wifi specs use the term link quality indicator (LQI) for this same measurement.

A closer examination of the data suggested that inSSIDer might actually be reporting the signal-to-noise-plus-interference ratio calculated from the link quality metric instead of reporting the actual signal strength.  Windows even suggests a formula for this:  (uLinkQuality / 2) - 100.  Note that this isn’t the same as the received signal strength; we’re talking about two different things here.  RSSI indicates the actual signal strength, while LQI is how much of the signal is visible above the noise.
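That formula amounts to a linear mapping where a link quality of 0 becomes -100 and a link quality of 100 becomes -50.  Here’s what the conversion looks like in code (a sketch, not the exact code from my test app):

    #include <stdio.h>

    /* Windows' suggested mapping: link quality 0 -> -100, 100 -> -50, linear in
       between.  Despite the dBm-looking numbers, the result is really an
       SNR-style figure in dB. */
    static int link_quality_to_db(unsigned int uLinkQuality)
    {
        return (int)(uLinkQuality / 2) - 100;
    }

    int main(void)
    {
        unsigned int samples[] = { 100, 84, 50, 0 };
        for (int i = 0; i < 4; i++)
            printf("uLinkQuality %3u -> %d\n", samples[i], link_quality_to_db(samples[i]));
        return 0;   /* prints -50, -58, -75, -100 */
    }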

I modified my test app to display the SNR as calculated from the link quality metric, and it matched inSSIDer Office exactly.  So while they claim to show dBm (received power relative to 1 milliwatt), it seems they’re actually reporting SNR, which is a ratio and should be expressed in dB, not dBm.

So now we know why the two apps are showing different values, but what do we do about it?  Which value is best?  If you’re measuring coverage ranges you generally want to use the actual received power (dBm) as your metric.  If you care about reliability you probably want to use the link quality indication because it includes the effects of noise and interference.

Most wifi installations are in congested areas.  All your neighbors have wifi and other devices interfering with your wireless network.  So inSSIDer is probably showing the most useful value to their customers, even if it is incorrectly named.

P.S.  If you’ve followed this closely you may have noted that SNR should be smaller than the RSSI reading in the presence of noise, but the signal quality metric shows a higher value than RSSI.  More about this later.


Hello world!

Coming soon:  my quest for safe and reliable storage in the age of ransomware.
