Overclocking wastes huge amounts of electricity. Power consumption does not increase linearly; I think it increases exponentially.
Fortunately, it's not exponential, or we wouldn't have 4 GHz machines today.
You can think of an electronic circuit as a capacitor and the clock as an AC source; the current is then proportional to both the voltage and the frequency: I = C * U * F
With P = U * I you get P = U * C * U * F = C * U² * F, so at constant voltage P ~ F:
20% higher clock, 20% more computing power, 20% more electrical power.
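For illustration, here is a minimal Python sketch of that relationship, assuming the dynamic-power formula P = C * U² * F; the capacitance, voltage, and clock values are made-up placeholders, not measurements from any real CPU:

```python
def dynamic_power(capacitance_f, voltage_v, frequency_hz):
    """Dynamic switching power: P = C * U^2 * F."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

C = 20e-9   # effective switching capacitance in farads (made-up value)
U = 2.2     # core voltage in volts (made-up value)
F = 500e6   # base clock in hertz (made-up value)

base = dynamic_power(C, U, F)
overclocked = dynamic_power(C, U, 1.20 * F)   # +20% clock, voltage unchanged

print(f"base power:  {base:.1f} W")                  # 48.4 W
print(f"+20% clock:  {overclocked:.1f} W")           # 58.1 W
print(f"increase:    {overclocked / base - 1:.0%}")  # 20%
```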
However, if you increase the voltage for better overclocking, you'll notice that P increases with U * U: P ~ U²
And this is what really starts burning electricity: a 10% increase in voltage alone draws 21% more power. That may let you raise the clock by another 5%, but then you'll be drawing 27% more power overall.
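Those percentages follow directly from P ~ U² * F; here is a short sketch that reproduces the arithmetic (the baseline cancels out, so no CPU-specific values are needed):

```python
def power_ratio(voltage_scale, clock_scale):
    """Relative power for given voltage and clock multipliers: U^2 * F."""
    return voltage_scale ** 2 * clock_scale

print(f"+10% voltage:            {power_ratio(1.10, 1.00) - 1:.2%}")  # -> 21.00%
print(f"+10% voltage, +5% clock: {power_ratio(1.10, 1.05) - 1:.2%}")  # -> 27.05%
```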
I experimented quite a bit with K6-II and K6-III CPUs, and reality matches these formulas very closely (C is essentially constant for a given CPU).