How does a computer know how good its wireless connection is?

I don't mean the speed. I mean, like, the bars or the percentage. How does it know the quality of its connection?
It sends out some packets and sees how many were responded to. That's basically how it works.
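If that's the mechanism, a toy version is easy to sketch. This is purely a hypothetical illustration, not any real driver's API: `link_quality_percent`, `sent`, and `acked` are made-up names, and in a real system the counts would come from the wireless driver's statistics.

```cpp
#include <cassert>

// Hypothetical sketch: estimate link quality as the fraction of probe
// packets that were acknowledged. "sent" and "acked" stand in for
// counters a real driver would maintain.
int link_quality_percent(int sent, int acked) {
    if (sent <= 0) return 0;        // no probes yet: report worst case
    return (acked * 100) / sent;    // integer percentage, 0..100
}
```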
closed account (S6k9GNh0)
So, it's based on packet loss?
I thought there was something about interference levels or something?
closed account (3hM2Nwbp)
I thought it was the level of attenuation in the signal. ( http://en.wikipedia.org/wiki/Attenuation - Edit - just noticed, the girl portrayed in that article is pretty cute :P ) There's most likely a standardized optimum signal strength, and the actual strength might be found by comparing the incoming signal with the optimum. This is just a stab in the dark on my part, but I got down to 1 bar on my laptop and pinged google 50 times with 0% packet loss, so I'm not too sure about packet loss.
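For what it's worth, the "compare the incoming signal against an optimum" idea can be sketched like this. The thresholds below are my own guesses (a typical received-signal-strength scale runs from roughly -30 dBm, excellent, down to about -90 dBm, unusable); none of these numbers come from the 802.11 spec.

```cpp
#include <cassert>

// Rough sketch with assumed thresholds: map a received signal strength
// indicator (RSSI) in dBm onto 0-4 "bars" by comparing it against a
// notional optimum (around -30 dBm) and a noise floor (around -90 dBm).
int rssi_to_bars(int rssi_dbm) {
    if (rssi_dbm >= -50) return 4;  // excellent
    if (rssi_dbm >= -60) return 3;  // good
    if (rssi_dbm >= -70) return 2;  // fair
    if (rssi_dbm >= -80) return 1;  // weak
    return 0;                       // essentially unusable
}
```

This would also explain the 1-bar/0%-loss observation above: a weak but stable signal can still deliver every ping.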
Actually, I think the original question was a bit vague... he could mean "Signal Strength" or "Signal Quality". Although generally they both improve as you get closer, you could theoretically have a very high-strength signal (low attenuation, if I understand it) with bad quality (lots of bad packets, noise, etc.).
I always thought you could just see the "bars" like on a cell phone service and it tells you the signal strength. That's how I know mine is good/bad.
I sure hope you're trolling...
It's both, guys; come on, this is radio communication 101. Your signal strength goes down due to attenuation. 802.11 doesn't just send one packet from your router to your PC for every piece of data it transmits. It's actually a complete cluster **** of a standard, IMO. When you access a web page, for instance, your router sends data to the device over and over again until it either gets a response from that device saying it received the data or it times out; this passes for one "error correction" method. Then it either moves to the next packet in the queue or just stops altogether.
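The "send over and over until you get a response or time out" loop described above can be sketched as a simple stop-and-wait retry. This is a toy model, not real 802.11 code: `try_send` is an assumed stand-in for one transmit attempt that reports whether an acknowledgement came back in time.

```cpp
#include <cassert>
#include <functional>

// Toy stop-and-wait retransmission: keep resending until an ACK arrives
// or we run out of attempts. Returns the number of attempts used, or -1
// if the transfer "timed out" (caller would then treat the link as lost).
int send_with_retries(const std::function<bool()>& try_send, int max_attempts) {
    for (int attempt = 1; attempt <= max_attempts; ++attempt) {
        if (try_send()) return attempt;  // ACK received: move to next packet
    }
    return -1;                           // gave up after max_attempts tries
}
```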

VERY General Overview of the 802.11 standard:
While the two devices remain in contact, they transmit a clock signal from the router to the device and back again; the round-trip time of that packet between the device and the router determines the signal strength. Loss of the clock signal due to a dropped packet or whatever does not mean immediate loss of the connection between the devices; rather, this is ignored until a predesignated number of packets are lost. Keep in mind that if a disconnect signal is sent from the device to the router, or vice versa, then this will also cause the signal to stop.
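The "ignore losses until a predesignated number of packets are missed" behaviour can be sketched as a miss counter. The threshold here is my own pick for illustration; real access points have their own (configurable) limits.

```cpp
#include <cassert>

// Sketch of a lost-packet tolerance counter: consecutive misses
// accumulate, any received packet resets the count, and the link is
// only declared down once the (assumed) threshold is reached.
struct BeaconMonitor {
    int misses = 0;
    int threshold;
    explicit BeaconMonitor(int t) : threshold(t) {}
    // Call once per expected packet; returns true while the link is
    // still considered up.
    bool on_beacon(bool received) {
        if (received) misses = 0;   // any successful packet resets the count
        else ++misses;              // consecutive misses accumulate
        return misses < threshold;  // drop only once past the threshold
    }
};
```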

Side note: the reason 802.11 is so GOOD is that it screams on every channel you allow it to, even going so far as to drown out other signals in the area.
Again, a VERY General Example: My Access Point (AP) sends a timing signal to my wireless-enabled phone, and the phone saves the data in that packet; then my phone sends the data back to the AP so that it knows I'm still here. This is done at regular time intervals in a pattern determined by the spec on the AP. If the AP sends another packet to my phone and that data isn't what my phone was expecting to be next in the series, then my phone accepts the data anyway (it has to, hint hint to aspiring grey hats), but it then knows that the signal is not at full strength, so you lose a bar or whatever indicates a weaker signal.
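That example can be boiled down to a sequence check. Entirely illustrative and not how 802.11 actually tracks sequence numbers: out-of-order data is still accepted, but each surprise costs a bar.

```cpp
#include <cassert>

// Toy version of the example above: the phone expects data in sequence;
// an unexpected value is still accepted, but counted against quality.
struct SequenceTracker {
    int expected = 0;
    int bars = 4;                                 // start at full strength
    // Accepts the incoming sequence number and returns the current bars.
    int accept(int seq) {
        if (seq != expected && bars > 0) --bars;  // out of order: lose a bar
        expected = seq + 1;                       // data is accepted either way
        return bars;
    }
};
```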