Please, somebody settle a debate ;-)
I think that the signal strength of WiFi is a measurement of the amplitude of the waveform: the more power you put into the signal, the stronger it is, so the further it travels and the stronger it is when it arrives at the receiver. My other half doesn't believe me, because he thinks this would cause distortion if the transmitter and receiver were too close together (because the amplitude would be too high).
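To make my side of the argument concrete, here's a rough Python sketch of received power vs. distance using the free-space (Friis) model. The transmit power, antenna gains, frequency and distances are just illustrative values I picked, and real indoor WiFi loses far more to walls, reflections and interference, but it shows how the received strength scales with transmit power and falls off with distance:

```python
import math

def received_power_dbm(tx_power_dbm, distance_m, freq_hz=2.4e9,
                       tx_gain_dbi=0.0, rx_gain_dbi=0.0):
    """Free-space received power (Friis equation) in dBm.

    Assumes ideal free-space propagation; real indoor WiFi loses
    considerably more to walls, reflections and interference.
    """
    c = 3e8  # speed of light, m/s
    # Free-space path loss in dB: 20*log10(4*pi*d*f/c)
    fspl_db = 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)
    return tx_power_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db

# A 20 dBm (100 mW) WiFi transmitter at 2.4 GHz:
for d in (1, 10, 50):
    print(f"{d:>3} m: {received_power_dbm(20, d):6.1f} dBm")
# roughly -20 dBm at 1 m, -40 dBm at 10 m, -54 dBm at 50 m
```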
My argument was that distortion (in the audio sense) occurs when the amplitude of a signal is too high for the speaker to reproduce. His response was that if that were the case, the audio would get louder the closer you got to the transmitter, and would eventually distort. The only explanation I have is that a radio receiver attenuates the signal as needed to avoid distortion when it is amplified and driven through the speaker. That would mean something like a crystal radio really would change volume as it moved closer to the transmitter, and could distort at the output. This wouldn't be a problem for WiFi, since the received signal is converted into electrical energy rather than kinetic energy (sound).
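To illustrate what I mean by "attenuates the signal", here's a toy Python sketch of automatic gain control (AGC), which is my guess at the mechanism. The target level and decay rate are made-up illustration values; the point is just that a strong (nearby) and a weak (distant) signal come out at roughly the same level:

```python
import math

def agc(samples, target=1.0, decay=0.01):
    """Toy automatic gain control: scale the input towards a fixed
    output level using a peak detector with exponential decay.
    (Target level and decay rate are arbitrary illustration values.)"""
    envelope = 1e-6          # running estimate of the input's peak amplitude
    out = []
    for s in samples:
        envelope = max(abs(s), envelope * (1 - decay))  # track peaks, slowly forget
        out.append(s * target / envelope)               # gain = target / envelope
    return out

# A "close" transmitter (amplitude 5.0) and a "far" one (amplitude 0.05)
# both come out at roughly the same level once the loop has adapted.
strong = [5.0  * math.sin(0.3 * n) for n in range(2000)]
weak   = [0.05 * math.sin(0.3 * n) for n in range(2000)]
print(max(abs(x) for x in agc(strong)[-200:]))   # ~ 1.0
print(max(abs(x) for x in agc(weak)[-200:]))     # ~ 1.0
```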
Can anyone clear this up for us? ;-)