Of course writing assembly code is not a requirement. Another myth, created around the time RISC arrived, is that compilers would keep getting better and better and assembly would become obsolete.
It has become mostly obsolete, though. Only if you really must have the fastest possible code for a certain operation will you bother with hand-optimized inner loops.
What I see, though, is many compilers generating worse code or having trouble applying the countless optimization techniques they advertise. Many times they can't even get basic optimizations right. GCC is one of the worst: it has regressed in speed and code density since version 2.
That's why I mostly use gcc 2. gcc 4 does, however, create faster code overall; code density is worse, though.
Have you looked at the code that newer versions of GCC generate? Oh yeah, probably not, since PowerPC assembler code is tedious instead of fun.
Actually I do, quite often. Tracking down bugs usually requires reading the disassembly by hand.
Remember how much easier it was to debug and to judge at a glance how well the code was optimized? Not that PowerPC is bad, but readable assembler is almost totally discounted these days, when it's actually very valuable.
68k, of course, is much more readable. I still like 68k, but I find less and less use for my 68k skills.
Understanding any assembly (even 68k) is a good thing, though, as it makes it much easier to pick up some other (more relevant) assembly language; the basic concepts are very similar. For example, I have no difficulty using IDA Pro to read x86 code, even though I probably couldn't write a single app in x86 asm.