Tue 22 Feb 2005
Everybody and their dog seems to be weighing in about Municipal Wi-Fi. Of course, they all have axes to grind and are taking the sides you would expect: the incumbent ISPs, telcos and cable companies are against it (sometimes in the form of thinly veiled "grassroots" organizations), while the community groups, potential users and equipment manufacturers are for it. No surprises, nothing to see here, though the outcome might be interesting.
Fri 4 Feb 2005
I get asked this question rather too often, so I'm posting my short answer here. The answer is rather more complex than it ought to be, and depends on a huge number of factors.
The most important factor is the receive sensitivity of your equipment. Many manufacturers fail to publish this data, but those that do will generally rate their radios in dBm at various data rates. As an example, let us take the venerable ORiNOCO Gold 802.11b "Classic" card. Its receive sensitivity is:
- -94 dBm at 1 Mbps
- -91 dBm at 2 Mbps
- -87 dBm at 5.5 Mbps
- -82 dBm at 11 Mbps
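For concreteness, here is a minimal Python sketch that looks up the fastest rate a given signal level could support against the sensitivity figures listed above. The function name and the example signal levels are illustrative only, not part of any driver or NetStumbler API.

```python
# Sensitivity thresholds for the ORiNOCO Gold "Classic", fastest rate first,
# taken from the list above: (data rate in Mbps, minimum signal in dBm).
ORINOCO_SENSITIVITY = [
    (11.0, -82),
    (5.5, -87),
    (2.0, -91),
    (1.0, -94),
]

def best_rate_mbps(signal_dbm, table=ORINOCO_SENSITIVITY):
    """Highest data rate whose sensitivity threshold the signal still clears,
    or None if the signal is below even the 1 Mbps figure."""
    for rate, threshold in table:
        if signal_dbm >= threshold:
            return rate
    return None

print(best_rate_mbps(-85))   # 5.5  (clears -87 but not -82)
print(best_rate_mbps(-96))   # None (no usable link on this card)
```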
The signal level you receive in an unobstructed environment depends on the transmitter power, the gain of the two antennas involved, and the distance between them, as well as any loss between the antenna and the radio at each end.
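To give a feel for the distance term, here is a sketch of the idealized free-space (Friis) path loss, which is the best case with clear line of sight and nothing in the way; the constant 32.44 applies when distance is in kilometres and frequency is in MHz. Real indoor paths lose considerably more than this.

```python
import math

def free_space_path_loss_db(distance_km, freq_mhz):
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 32.44 + 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz)

# 100 m apart on Wi-Fi channel 6 (2437 MHz): roughly 80 dB of path loss.
print(round(free_space_path_loss_db(0.1, 2437), 1))
```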
In practice, radio waves behave unpredictably in a number of ways. First, the signal will fade in and out due to multipath effects (radio waves that bounce off objects and increase or decrease the signal that you receive). The further the receiver is from the transmitter, and the more objects between them, the greater this effect will be. Walls, people, electronic equipment, and rain/snow/ice/fog are all quite effective at decreasing your signal level. In a typical home or small office environment without too many obstructions, a 10 dB variation in signal level is quite normal. So, if you are looking at a NetStumbler scan and the signal is consistently around -65 dBm, it could drop to -75 dBm when somebody comes over to talk to you.
Summary so far:
(Received signal) = (transmit power) - (loss between transmitter and antenna) + (transmit antenna gain) - (path loss) - (multipath and obstruction loss) + (receive antenna gain) - (loss between antenna and receiver)
In order to operate, (received signal) must be greater than (receiver sensitivity).
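Written as code, the budget above is just additions and subtractions in dB: gains add, losses subtract. The numbers in the example below are illustrative, not measurements of any particular product.

```python
def received_signal_dbm(tx_power_dbm, tx_cable_loss_db, tx_antenna_gain_dbi,
                        path_loss_db, obstruction_loss_db,
                        rx_antenna_gain_dbi, rx_cable_loss_db):
    """The link budget above, with every term in dB."""
    return (tx_power_dbm
            - tx_cable_loss_db
            + tx_antenna_gain_dbi
            - path_loss_db
            - obstruction_loss_db   # multipath and obstruction loss
            + rx_antenna_gain_dbi
            - rx_cable_loss_db)

# 15 dBm card, 2 dBi antennas, 1 dB of pigtail loss at each end,
# 80 dB of path loss and 10 dB of multipath/obstruction loss:
rx = received_signal_dbm(15, 1, 2, 80, 10, 2, 1)
print(rx)        # -73 (dBm)

# "(received signal) must be greater than (receiver sensitivity)":
print(rx > -82)  # True: enough for 11 Mbps on the ORiNOCO Gold figures above
```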
Another factor is noise. This is "background" radio-frequency junk that your receiver can "hear" but needs to reject. Sources of noise include other wireless networks, cordless phones, microwave ovens, radio hams and medical equipment. Like other radio phenomena, noise may be highly variable. Many wireless network adapters do not report noise, so if you're using NetStumbler with them you can't even tell how much noise you have in your environment. A typical urban location these days might have an average noise level around -95 dBm. When you switch on the microwave oven or take a call on your 2.4 GHz phone, this value will increase. I've seen a 2.4 GHz phone produce -50 dBm of noise, which is enough to saturate some Wi-Fi radios and thus kill their connection completely.
Let's take these concepts and combine them. In order to operate, the actual signal level at your receiver needs to be comfortably above the noise level. The actual signal level varies with fading, so if you measured -75 dBm one day, it might drop to -85 dBm occasionally. On most radios this is sufficient to make them fall back to a lower data rate, and on some it will cause the connection to drop altogether. Likewise, your background noise might be around -98 dBm, but then your neighbor takes a call on her cordless phone and it jumps to -78 dBm. Combined with multipath fading, this is enough to make your connection drop randomly.
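Putting the pieces together, here is a rough go/no-go sketch that allows for the 10 dB fade described above and demands some headroom over the noise floor. The 10 dB SNR requirement is a rule-of-thumb assumption on my part, not a figure from the 802.11 spec, and the example numbers are the ones used in the text.

```python
def link_ok(signal_dbm, noise_dbm, sensitivity_dbm,
            fade_allowance_db=10, required_snr_db=10):
    """Rough check: does the faded signal still clear both the receiver
    sensitivity and a minimum signal-to-noise ratio?"""
    worst_signal = signal_dbm - fade_allowance_db
    snr_db = worst_signal - noise_dbm
    return worst_signal > sensitivity_dbm and snr_db >= required_snr_db

# -75 dBm signal, checked against the 5.5 Mbps sensitivity (-87 dBm):
print(link_ok(-75, -98, -87))   # True: quiet noise floor, plenty of margin
print(link_ok(-75, -78, -87))   # False: the cordless phone buries the signal
```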
My conclusion, therefore, is:
Q: What signal level should I consider usable for a good wireless link?
A: Depends on your equipment and your environment.