One of my lecturers at uni was pretty high up in CPU design at IBM in the '80s, and you can bet your ass they tested every parallel-processing scenario they could for server and desktop OS efficiency.
A lecturer with no experience in processor design in the last 30 years tells you something and you just believe him? Priceless.
While some algorithms are inherently serial and can't easily be split across cores, a lot of them can be. It does take intelligence to design your code that way, though, and unfortunately most programmers lack it. They only know what they've been taught by people who couldn't hack it in industry and became lecturers instead.
In the real world there are people who have written code that is efficient and can take advantage of a large number of cores. Maybe one day you'll meet one.
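Plenty of everyday workloads split cleanly, too. Here's a minimal Java sketch (the class name and numbers are mine): counting primes by trial division, where every element is independent, so .parallel() fans the range out across the common fork/join pool, roughly one worker thread per core.

    import java.util.stream.LongStream;

    public class ParallelPrimes {
        // Trial division: deliberately CPU-bound, and fully independent
        // from one element to the next.
        static boolean isPrime(long n) {
            if (n < 2) return false;
            for (long d = 2; d * d <= n; d++) {
                if (n % d == 0) return false;
            }
            return true;
        }

        public static void main(String[] args) {
            // .parallel() splits the range across the common fork/join
            // pool, typically one worker per available core.
            long count = LongStream.rangeClosed(2, 5_000_000)
                                   .parallel()
                                   .filter(ParallelPrimes::isPrime)
                                   .count();
            System.out.println("Primes below 5,000,000: " + count);
        }
    }

Drop the .parallel() call and the same pipeline runs on one core; the difference on an 8-core box is hard to miss.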
There's a big, big difference between massive multicore in special-purpose applications, and massive multicore for general-purpose computing, though. Graphics in particular is a task that's pretty much tailor-made for massive parallelism. Word processors, file managers, web browsers, and other unglamorous productivity software? Not so much.
Of your examples, web browsing is the only one that really needs a performance boost. Between JavaScript and compressed video there's definitely a need for more performance, and there's no reason it can't come from more cores.
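Compressed media is a decent illustration: once a stream is cut into independently decodable chunks, every chunk can be decoded on its own core. A toy sketch (gzip chunks standing in for real media packets, all names mine):

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;
    import java.util.zip.GZIPInputStream;
    import java.util.zip.GZIPOutputStream;

    public class ParallelDecode {
        public static void main(String[] args) throws Exception {
            // Fake workload: 32 independently compressed chunks, standing
            // in for the independently decodable units of a media stream.
            List<byte[]> chunks = new ArrayList<>();
            for (int i = 0; i < 32; i++) {
                ByteArrayOutputStream buf = new ByteArrayOutputStream();
                try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
                    gz.write(("chunk " + i + " ").repeat(10_000).getBytes());
                }
                chunks.add(buf.toByteArray());
            }

            // One worker per core; each chunk decompresses independently.
            ExecutorService pool = Executors.newFixedThreadPool(
                    Runtime.getRuntime().availableProcessors());
            List<Future<Integer>> results = new ArrayList<>();
            for (byte[] chunk : chunks) {
                results.add(pool.submit(() -> {
                    try (GZIPInputStream in = new GZIPInputStream(
                            new ByteArrayInputStream(chunk))) {
                        return in.readAllBytes().length; // decoded size
                    }
                }));
            }
            long total = 0;
            for (Future<Integer> f : results) total += f.get();
            pool.shutdown();
            System.out.println("Decoded " + total + " bytes");
        }
    }

Which is roughly the shape of what browsers already do with worker threads and per-tab processes.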
There's also a rise in using SQL databases for storage in a lot of applications where you wouldn't expect it; even Android ships SQLite for app storage. A decent SQL engine will make use of multiple cores.
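On the single-machine end, SQLite runs each query on the thread of the connection that issued it, so independent connections on separate threads naturally land on separate cores; server engines like PostgreSQL go further and parallelize individual queries. A rough sketch, assuming the org.xerial sqlite-jdbc driver is on the classpath (the file and table names are made up):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class ConcurrentQueries {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:sqlite:demo.db"; // hypothetical local file

            // Set up a small table on one connection.
            try (Connection setup = DriverManager.getConnection(url);
                 Statement st = setup.createStatement()) {
                st.execute("CREATE TABLE IF NOT EXISTS t"
                        + "(id INTEGER PRIMARY KEY, v INTEGER)");
                for (int i = 0; i < 1000; i++) {
                    st.execute("INSERT INTO t(v) VALUES (" + i + ")");
                }
            }

            // Independent read queries on separate connections: each one
            // runs on its submitting thread, so they can occupy different
            // cores at the same time.
            ExecutorService pool = Executors.newFixedThreadPool(4);
            for (int w = 0; w < 4; w++) {
                pool.submit(() -> {
                    try (Connection c = DriverManager.getConnection(url);
                         Statement s = c.createStatement();
                         ResultSet rs = s.executeQuery("SELECT SUM(v) FROM t")) {
                        rs.next();
                        System.out.println("sum = " + rs.getLong(1));
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                });
            }
            pool.shutdown();
        }
    }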
The future of general-purpose computing is multiple cores; programmers will need to adapt.