Frequently it does. The application that works is more "good" than the application that doesn't work. The application that exists generally works, while the application that doesn't exist does not. Thus the application that has been written is more "good" than the one that hasn't been written.
You can say that the application isn't "good", but it is WAY more "good" than if it didn't exist at all, which is the more likely outcome if heavy optimization were the requirement.
Again, I'm not saying there aren't sufficient reasons to settle for sub-optimal code - yes, commercial software development takes time and money, and yes, it's more important to meet the requirements in a reasonable time frame than to delay indefinitely in hopes of attaining perfection (this, for instance, is the reason people took to Linux and not GNU Hurd).
But this idea that sub-optimal code is the ideal, instead of something to be settled for, just because we now have hardware on which the difference is less noticeable, is something I will not accept. No way, nohow.
It is crap for us, yes, but business is different, and profit and revenue are the only concerns for corporate entities. Being unique is as good as it gets.
See, I'm not going to say that businesses shouldn't settle for what works for them as far as investment/return considerations go. But I do not understand the now-prevailing notion that corporate financial considerations are the true measure of goodness.
Sure, it works for a business, because software to a business is either a tool to aid in the operation of the business, or a product to be created and distributed by the business. But when did that become the goal for ALL programmers!? Why should our tastes and our ideals be defined by the considerations of some non-existent company that we aren't a part of?