... probably just slowed down the stopwatch used to measure the fps... The 28 MHz master clock drives the chipset and the CIAs that house the system timers. If the CPU clock doesn't change (as it wouldn't in an A4000), the system appears to run faster because of the time warp.
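To make the "time warp" concrete, here's a minimal C sketch of the arithmetic (the 20% slowdown is a made-up figure, not a measurement from that machine): a benchmark that counts EClock ticks but divides by the nominal EClock rate will over-report performance by exactly the factor the timer base was slowed.

```c
#include <stdio.h>

int main(void)
{
    /* Nominal PAL EClock is ~709379 Hz; pretend the 28 MHz master clock
     * (and with it the EClock) has been slowed by 20% while the CPU
     * keeps running at full speed. */
    const double nominal_eclock_hz = 709379.0;                /* what the benchmark assumes */
    const double actual_eclock_hz  = nominal_eclock_hz * 0.8; /* slowed master clock */

    const double real_seconds = 1.0;   /* a constant-time operation: 1 s of real time */
    const double ops = 1.0e6;          /* say the benchmark runs 1,000,000 operations */

    /* The benchmark counts EClock ticks, then divides by the *nominal* rate. */
    double ticks_counted    = real_seconds * actual_eclock_hz;
    double seconds_reported = ticks_counted / nominal_eclock_hz;   /* 0.8 s instead of 1.0 s */

    printf("real rate:     %.0f ops/s\n", ops / real_seconds);
    printf("reported rate: %.0f ops/s (inflated by %.0f%%)\n",
           ops / seconds_reported,
           (real_seconds / seconds_reported - 1.0) * 100.0);
    return 0;
}
```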
IIRC, it was Redrumloa.
I was curious about his claim, so I wrote a PPC-based timer that measures a delay using GetSysTimePPC(), which is independent of the EClock. It measured EClock-based delays from the PPC side, after calibrating for context-switch overhead and the like, and I got him to test it. It definitely seemed that slowing the clock introduced a "time dilation" effect in most 68K benchmarking tools: any operation taking constant real time required fewer EClock ticks as the system clock was slowed down, and since the software assumed the tick rate was unchanged, this produced a corresponding increase in the reported performance.
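The cross-check itself is conceptually simple. Below is a rough sketch of the idea in plain C, not the original tool: clock_gettime() stands in for GetSysTimePPC() as the independent clock, and a stub eclock_delay_ms() stands in for whatever EClock-based wait was under test (on a real slowed machine it would overshoot its claimed duration).

```c
#include <stdio.h>
#include <time.h>

/* Stand-in for the EClock-based delay under test; here it just sleeps,
 * so on a normal machine the ratio below should come out near 1.0.
 * On a machine with a slowed master clock, a "500 ms" EClock wait would
 * take longer than 500 ms of real time. */
static void eclock_delay_ms(unsigned ms)
{
    struct timespec ts = { ms / 1000, (ms % 1000) * 1000000L };
    nanosleep(&ts, NULL);
}

/* Elapsed seconds according to a clock independent of the EClock.
 * In the original tool this role was played by GetSysTimePPC(). */
static double now_independent(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void)
{
    /* 1. Calibrate: measure the fixed overhead (context switches, call
     *    cost) of timing a zero-length delay. */
    double t0 = now_independent();
    eclock_delay_ms(0);
    double overhead = now_independent() - t0;

    /* 2. Time a delay that the EClock side believes is 500 ms. */
    const double claimed_s = 0.500;
    t0 = now_independent();
    eclock_delay_ms(500);
    double actual_s = (now_independent() - t0) - overhead;

    /* 3. A ratio above 1 means the EClock is running slow, and EClock-based
     *    benchmarks will over-report performance by roughly that factor. */
    printf("claimed %.3f s, measured %.3f s, ratio %.3f\n",
           claimed_s, actual_s, actual_s / claimed_s);
    return 0;
}
```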
This doesn't discount his original claim that some things genuinely got faster, but it does call the quantitative results into question.