I would flat-out recommend AGAINST computer science degrees.
Science is theory and hypothesis, and does not apply to
the stark realities of the real world. You want to learn
to code better? Coding in THEORY is all well and good but
it'll be useless for a proper project. You'd do better
taking on a realistic project and devising your own
solutions. Writing and optimising your own code is the best
way to learn to code. A little formal education doesn't
hurt, but you need to do it as an "engineer", rather than
a "scientist".
Whew, I could really rant about this, but I hope it's clear by now that I recognize, almost to a fault, the value of 'real' experience.
So, that said, "science" triumphs because science is
reproducible.
What science could "reproduce" in 1970 was fairly pathetic, because science didn't know enough. (Actually, it turns out it did, but moving at the speed of academia, it took a long time for that knowledge to trickle down and around to the right places.) ... And what it did know was too 'expensive' to ever bring to market; witness Xerox PARC, if you want.
Thus, it took kids in garages and bedrooms to make the 'breakthroughs' that science could then study. Who knew these "personal computers" would be such a hit at all?
You can't compare Woz and the like
directly to Tesla, but for practical purposes, note the parallel. Early marketers like Commodore and Atari didn't necessarily care how it worked (along the lines of Marconi), and as long as they had "more magic" than anyone else (wrought by purchasing more cool kids, like Amiga), life was good.
Well, today, science is to some extent vindicated, for two reasons. First, faced with the likes of the Apple II and the Amiga (and here's where the flip side came in -- the engineers of both were, of course, better 'scientists' than the scientists!), the big movers threw their scientists at the problems. Second, for want of science, everyone banking on miracles had a tendency to piss their money away down blind alleys.
OS/2 and NT were architected by 'scientists'; they flopped in their own ways, and the 'scientists' learned from their mistakes, found patterns in the industry, and so on. XP and the average Linux desktop environment are both infinitely more 'bloated,' yet they've spread like wildfire (and run rather nippily in spite of themselves, to boot). Since their advances are now entrenched in the literature,
it's now hard to do worse -- though especially in MS's case, the entire planet is now waiting for a slip-up.
There's no sense in fearing science; that amounts to worshipping ignorance. However, it's only bad scientists who worship science without constantly questioning and testing it. Maybe it's predictable to bring up Dillon again, but I think it's safe to hold up the cycle of DragonFly BSD development as "good science," and I think most of us here can be convinced that that system will (or already does) kick a fair bit of a$$ technically. ;-)
At this point, I suppose it's worth pointing out that 'science' has even crept into purchasing departments, to the point that, yes, it's a bit harder to push snake oil these days. (Or at least, snake oil is held to a slightly higher standard, in terms of maximum-time-before-lawsuit.) You can still get by pushing it... but do you want to spend your life profiting off the suckers, or earning the respect of the wizards?
...
All that said, the worst CS profs I encountered were the ones trying to teach 'science' while straining to be pragmatic. It's one thing to remind a student of what happens in the Real World... and quite another to hold up Visual Studio as the one true hammer for all the world's screws, even if that
is the present state of belief.