As others have said, Commodore did design a 16-bit Paula replacement, but it was never put into production.
The reason they stuck with the 8-bit chip was probably that, to most people even in 1993 (i.e. 90% of the users), digitized 8-bit audio was "good enough" for games and other non-critical applications. Remember, most people were using 1084 monitors with built-in (crappy) speakers. Some of those 1084 monitors were even monophonic.
The only people who would have noticed a real difference between 8-bit and 16-bit audio, or cared enough to pay for it, were musicians and audiophiles.
Sure, if you actually played a 16-bit sample beside an 8-bit sample, one after the other, most people would probably have noticed the improvement in clarity. But without a side-by-side comparison to spotlight what was missing, most people at the time probably thought 8-bit audio was just fine for their needs.
I suspect that a lot of people today wouldn't even notice or care about the difference between 8-bit and 16-bit audio. My wife, for instance, doesn't care whether her favourite song is playing on the little, tinny mono FM radio we have in the kitchen or on our big "hi-fi" stereo - as long as she can hear and enjoy it.
Look at the success of MP3, for instance. Technically, we've taken a step backward in audio quality from Compact Discs, but the public loves it (admittedly, it has other advantages - smaller file size, for instance).
Graphics, on the other hand, seem to matter more to the majority of people. So it probably made sense for Commodore to upgrade the Amiga graphics chipset to AGA and leave Paula as it was - thereby saving money and keeping the price of their machines lower than the competition's.
Look at how many people today buy large HD televisions but never buy a surround sound system to accompany them. They're fine with the TV's little built-in speakers. Perhaps humans are more visual than auditory.