@DavidF215 & fishy_fiz
GCC's PPC code generation is not nearly as bad as its 68k code generation. The newest 4.x versions of GCC still do not generate optimized code for 68k. Some of the newer versions are worse.
What type of impact does this lack of optimization cause? I've read a little about it, but haven't read about its actual impact. Does it hurt graphics performance, I/O performance, data processing performance, etc.? For example, in a game, what is significantly impacted: blitting, game logic, sound, networking? At what processor level is the hit noticeable, and at what processor level, if any, would it no longer be noticed?
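For a concrete sense of where code generation quality tends to show up, here is a minimal blit-style inner loop in plain C. The function name, buffer sizes, and strides are hypothetical, not taken from any real game or library; the general point is that tight per-pixel loops like this are dominated by register allocation, addressing-mode selection, and loop overhead, so a weak 68k back end is felt far more in blitting and similar inner loops than in one-off code such as menu handling or network setup.

#include <stdio.h>
#include <string.h>

/* Hypothetical blit-style inner loop (names and sizes are made up).
 * Hot loops like this are where 68k code generation quality shows:
 * register allocation, post-increment addressing, and loop overhead
 * dominate the cost of touching every pixel of every frame. */
static void copy_rect(unsigned char *dst, const unsigned char *src,
                      int width, int height,
                      int dst_stride, int src_stride)
{
    int x, y;
    for (y = 0; y < height; y++) {
        for (x = 0; x < width; x++) {
            dst[x] = src[x];      /* per-pixel copy */
        }
        dst += dst_stride;        /* step to the next scanline */
        src += src_stride;
    }
}

int main(void)
{
    unsigned char src[64 * 64];
    unsigned char dst[64 * 64];

    memset(src, 0xAA, sizeof(src));
    memset(dst, 0x00, sizeof(dst));

    /* Copy a 32x32 block out of a 64-pixel-wide "bitmap". */
    copy_rect(dst, src, 32, 32, 64, 64);

    printf("dst[0] = 0x%02X\n", dst[0]);
    return 0;
}

On a CPU-bound 68000 or 68020 machine, the difference between good and poor code generation for a loop like this can be the difference between a playable frame rate and a slideshow; on a fast emulated or accelerated system there is enough headroom that the same inefficiency is barely visible.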
GCC 2.95 may generate the best 68k code of the GCC line. VBCC and SAS/C generate better code in most cases. VBCC is still being updated with Amiga support, which includes an awesome optimizing assembler. Frank Wille has some other development tools he is supporting as well. Thomas Richter is actively supporting his Mu (MMU) tools.
I have SAS/C, I think. I was using StormC3 for a while instead.
I am making a much-improved version of the ADis disassembler, which is already very useful. We have some of the low-level tools, but some of the high-level tools are very sophisticated and require a lot of time and skill to perfect.
I stopped at C/C++. Tried to learn assembly but decided it was too much work.

Making a cross-platform C compiler which supports most CPUs is a huge project, much like the multi-platform web browsers that we don't have.
Yes, I've experienced how cross-platform development can be a pain, especially deciding which dev tools to use.
I think we are gaining some momentum and users back. If we could get an FPGA solution like Natami or MiniMig+ with AGA and 020+, we might have something.
This is what I'm watching, too. If a MiniMig+ with AGA (RTG maybe?) hits, I might buy one to replace my aging A1200.
I've been wondering if developing for 68k would be adequate for most Amiga systems. It's emulated on faster machines and native on 68k hardware, so it seems a decent fit for all. Graphics seem to be the sticking point.