First: can we use received signal strength (RSS) to estimate distance?
Why not just use the inverse-distance-squared law to estimate distance
from the access point (AP)?
The scattergram on the left shows about 20,000
RSS values (vertical axis, in dBm)
plotted against the actual distance between smartphone and access point
(horizontal axis, in meters), measured in a typical three-level house.
The solid curve corresponds to the inverse square law,
which is clearly not a good fit to the observed signal strength
(other than providing an upper bound).
More importantly, the spread in RSS at any given distance is huge,
making the inversion from RSS to distance ill-posed.
No path-loss model, no matter how complex, can overcome this problem.
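To see why, here is a minimal sketch (in Python) of the naive inversion under the inverse square law; the reference RSS at 1 m and the path-loss exponent are assumed illustrative values, not fitted to the measurements above.

```python
# Hypothetical log-distance model with an inverse square exponent:
#   RSS(d) = RSS_1M - 10 * N * log10(d)
RSS_1M = -40.0   # dBm at 1 m -- an assumed reference, not from the data above
N = 2.0          # path-loss exponent: 2 for the inverse square law

def distance_from_rss(rss_dbm):
    """Naive inversion: estimated distance in meters from a single RSS reading."""
    return 10 ** ((RSS_1M - rss_dbm) / (10.0 * N))

for rss in (-70.0, -75.0, -80.0):
    print(f"RSS {rss:.0f} dBm -> estimated distance {distance_from_rss(rss):5.1f} m")
```

Under this model a spread of just 10 dB in RSS changes the estimated distance by more than a factor of three, which is why the scatter in the plot makes the inversion hopeless.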
By the way, the observed decay with distance better fits that of a
signal passing through an absorbing medium: absorption attenuates the
power exponentially with distance, which shows up as a loss that is
linear in dB per meter.
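As a rough sketch of such a combined model: inverse square spreading times exponential absorption gives, in dB, an extra loss term proportional to distance. The reference level below is an assumed value, and the absorption coefficient is set to the 1.1 dB per meter slope from the fit described below.

```python
import math

# Inverse square spreading times exponential absorption (Beer-Lambert):
#   P(d) = P0 * (1 / d^2) * exp(-a * d)   ==>   in dB:
#   RSS(d) = RSS_1M - 20 * log10(d) - ALPHA * d
RSS_1M = -40.0   # dBm at 1 m -- an assumed reference level
ALPHA = 1.1      # dB of excess loss per meter (the slope fitted below)

def rss_absorbing_medium(d_m):
    """Predicted RSS (dBm) at distance d_m meters under the combined model."""
    return RSS_1M - 20.0 * math.log10(d_m) - ALPHA * d_m

for d in (1.0, 5.0, 10.0, 20.0):
    print(f"{d:4.1f} m -> {rss_absorbing_medium(d):6.1f} dBm")
```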
If we subtract the expected inverse-square drop-off
from the RSS values, we are left with the scattergram on the right.
Fitting a linear relationship between this excess path loss
and distance yields a slope of -1.1 dB per meter
(although with a huge error term),
as indicated by the dashed line.
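For concreteness, here is a minimal sketch (Python/NumPy) of how such a fit could be computed. The data below is a synthetic stand-in for the actual measurements, so only the shape of the computation, not the numbers, carries over.

```python
import numpy as np

# Synthetic stand-in for the measurements: distances (m) and RSS values (dBm),
# generated from the combined model above plus large Gaussian scatter.
rng = np.random.default_rng(0)
dist_m = rng.uniform(1.0, 20.0, 20_000)
rss_dbm = (-40.0 - 20.0 * np.log10(dist_m) - 1.1 * dist_m
           + rng.normal(0.0, 8.0, dist_m.size))

# Excess path loss: what remains after subtracting the inverse square drop-off.
excess_db = rss_dbm - (-40.0 - 20.0 * np.log10(dist_m))

# Least-squares line through (distance, excess path loss).
slope, intercept = np.polyfit(dist_m, excess_db, 1)
print(f"slope = {slope:.2f} dB/m, intercept = {intercept:.2f} dB")
```

With this many points the fitted slope itself is well determined; the "huge error term" refers to the many-dB residual per individual measurement, which is what makes per-reading distance estimates unreliable.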