This is a scheduling issue, and it's an age-old problem that plagues every OS. It has nothing to do with the hardware; any multitasking OS can exhibit it, depending on what else the system is doing.
If you play an "OS friendly" game on the Amiga, it might have stuttering issues, too. You can only get guaranteed performance by disabling multitasking. Of course, that defeats many of the reasons for having an OS in the first place, and it's not an acceptable option these days.
If you run the DirectX test suite, you'll almost certainly get perfect vsync with no stuttering at all. I never get stuttering when running the test. Ever.
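You can actually watch the scheduler do this to you by timing frames yourself. Here's a minimal sketch, assuming a 60 Hz display, with the actual rendering stubbed out by a Sleep standing in for a vsync-locked Present(); it just logs any frame that blows past its deadline:

    // Log frame-time spikes, which is what scheduling-induced stutter
    // looks like from inside a render loop. Assumes 60 Hz refresh.
    #include <windows.h>
    #include <cstdio>

    int main() {
        LARGE_INTEGER freq, prev, now;
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&prev);

        const double vsyncMs = 1000.0 / 60.0;  // assumed refresh interval

        for (int frame = 0; frame < 600; ++frame) {
            // ... real rendering and a vsync-locked Present() go here ...
            Sleep(16);  // stand-in for the frame's work

            QueryPerformanceCounter(&now);
            double ms = 1000.0 * (now.QuadPart - prev.QuadPart) / freq.QuadPart;
            prev = now;

            // A frame that eats most of a second vsync interval means
            // something (usually the scheduler) stole the deadline:
            // that's one visible stutter.
            if (ms > vsyncMs * 1.5)
                printf("frame %d: %.2f ms (missed vsync)\n", frame, ms);
        }
        return 0;
    }

Run that on an idle machine and the log will likely stay empty, just like the DirectX tests. Kick off a background compile or a virus scan and chances are the spikes show up.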
Also note that many modern game consoles are having stuttering/tearing issues, too. The reason is that, thanks to the HD craze, developers are coding consoles as if they were PCs, where the software isn't hard coded for one screen resolution but instead has to adapt to different video modes (including widescreen). They are also using OSes instead of just writing memory blocks wherever the heck they want. Does that mean console hardware sucks, too? No, it's a coding issue. Whether it's an OS issue or a game issue, or both, is a matter for debate.
My take is that people just don't care about refinement these days. It's all about max framerates and bragging rights. If Microsoft made changes to Windows so that video performance were more consistent but slower, you'd better believe PC enthusiasts would b**ch about it to no end.
The only solution is to make a gaming OS, or to switch the OS into a "game mode". That's easier said than done, given how many games are installing kernel-mode DRM drivers and otherwise taking control of your PC away from you. It's a sad situation, but for the most part, everyone is to blame... at least in terms of software.
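For what it's worth, a crude per-process version of "game mode" already half-exists as a handful of Win32 knobs. Here's a rough sketch using real calls (timeBeginPeriod, SetPriorityClass, SetThreadPriority); whether it actually smooths anything out depends on the workload, and I've deliberately avoided REALTIME_PRIORITY_CLASS because it can starve the rest of the system:

    // A per-process "game mode" sketch: ask Windows for a finer scheduler
    // tick and bump priority so the render thread gets preempted less.
    #include <windows.h>
    #pragma comment(lib, "winmm.lib")

    void enterGameMode() {
        timeBeginPeriod(1);  // request 1 ms timer granularity
        SetPriorityClass(GetCurrentProcess(), HIGH_PRIORITY_CLASS);
        SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_HIGHEST);
    }

    void leaveGameMode() {
        SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_NORMAL);
        SetPriorityClass(GetCurrentProcess(), NORMAL_PRIORITY_CLASS);
        timeEndPeriod(1);    // must pair with timeBeginPeriod
    }

Of course, that only makes one process less likely to be preempted; it's a band-aid, not the guaranteed scheduling a true gaming OS would give you.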
PC hardware today isn't what PC hardware was 20 years ago. It doesn't suck anymore. The last remaining Amiga enthusiasts still haven't figured that out.