Overclocking wastes a huge amount of electricity. Power consumption doesn't increase linearly with clock speed; I think it rises a lot faster than that. It would be smarter to underclock it, then next year buy yourself a new computer with all the money you saved on your electric bill by not overclocking.
Not exactly true. Overclocking increases power draw less than linearly *until* you start raising voltages; only then does consumption rise drastically. In fact, I've even had a Core 2 Duo that I overclocked and undervolted, and it drew less power than at stock frequency.
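For anyone curious why voltage matters so much more than frequency, here's a rough sketch using the textbook CMOS dynamic power approximation P ≈ C·V²·f. The capacitance constant, clocks, and voltages below are made-up illustrative numbers, not measurements from any real chip, but they show how the V² term dominates and why an undervolted overclock can land under stock power:

```python
# Rough illustration of the CMOS dynamic power approximation:
#   P_dynamic ≈ C * V^2 * f
# All numbers below are made up for illustration, not real measurements.

def dynamic_power(capacitance, voltage, freq_ghz):
    """Approximate dynamic power draw (arbitrary units)."""
    return capacitance * voltage ** 2 * freq_ghz

C = 10.0  # effective switched capacitance (arbitrary constant)

stock        = dynamic_power(C, voltage=1.25, freq_ghz=2.4)  # stock volts/clock
oc_overvolt  = dynamic_power(C, voltage=1.40, freq_ghz=3.2)  # overclock + overvolt
oc_undervolt = dynamic_power(C, voltage=1.10, freq_ghz=3.0)  # overclock + undervolt

print(f"stock:                 {stock:.1f}")         # ~37.5
print(f"overclock + overvolt:  {oc_overvolt:.1f}")   # ~62.7 (about +67% over stock)
print(f"overclock + undervolt: {oc_undervolt:.1f}")  # ~36.3 (slightly below stock)
```

Static/leakage power and how far your particular silicon can be undervolted complicate the picture, but the squared voltage term is why raising or lowering volts swings power draw so much harder than the clock bump itself.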
Some architectures overclock reasonably well without being overvolted.