
Author Topic: Coldfire - Binary Compatible  (Read 7713 times)


Offline Louis Dias

Re: Coldfire - Binary Compatible
« on: January 30, 2008, 01:55:31 PM »
I'm pretty sure that when I looked up Coldfire tools, Freescale had a binary analyzer/converter that could take a 68K binary and make it CF-native.

Regardless, perhaps the first step should be recompiling 68k AROS to Coldfire.
 

Offline Louis Dias

Re: Coldfire - Binary Compatible
« Reply #1 on: January 30, 2008, 10:53:44 PM »
Why couldn't a 68K be virtualized and then fed the binary? The virtual 68K could make the determination, and in the background the app would repackage the binary in a Coldfire-compatible way.  The resulting binary could then be run natively.

Or someone could make a 68k->Coldfire compiler... or a 68k disassembler -> CF instruction converter -> Coldfire re-assembler -> Coldfire compiler chain.
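A converter along those lines could be sketched as a lookup over disassembled instructions. This is only a sketch: the mnemonics and the expansion table are illustrative placeholders, not a real or semantically correct 68K-to-ColdFire mapping.

```python
# Sketch of the disassemble -> convert -> reassemble idea.
# The table entries are placeholders only; they are NOT semantically
# correct 68K-to-ColdFire rewrites.
EXPANSIONS = {
    # a 68K instruction the ColdFire core might lack, replaced by a
    # multi-instruction CF sequence (placeholder rewrite)
    "ROL.L #1,D0": ["LSL.L #1,D0", "ADDX.L D7,D0"],
}

def translate(asm_lines):
    """Convert a disassembled 68K listing to a CF-compatible listing.
    Instructions not in the table are assumed to be valid as-is."""
    out = []
    for line in asm_lines:
        out.extend(EXPANSIONS.get(line, [line]))
    return out
```

The output would then be fed to a ColdFire assembler. A real tool would of course work on binary opcodes and addressing modes rather than text, but the table-driven shape is the same.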
 

Offline Louis Dias

Re: Coldfire - Binary Compatible
« Reply #2 on: January 31, 2008, 03:27:38 PM »
Quote

bloodline wrote:
Quote

Piru wrote:
@bloodline

As far as I can tell he didn't, he wants separate, native binaries:
Quote
The resultant binary could then be run natively.


Yes, you are quite right, he did imply that.

I really can't understand the obsession many on this site have with the Coldfire... If I were starting a project, I'd choose either an ARM or an x86... depending upon the application.

Yes that's what I meant.
Sort of like a JIT, but instead of actually executing the code it would analyze the execution path, mark offsets, and translate the program into a CF-compatible version.  If one 68K instruction has to be emulated by four CF instructions, then all jump/branch offsets that follow have to be moved up by the extra three instructions' worth of bytes/words.
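That offset-fixup step can be illustrated with a small sketch. The sizes and indices below are made up; the point is that once an instruction expands, every later address shifts, so branch displacements must be recomputed from the new layout.

```python
def layout(sizes):
    """Assign a byte address to each instruction from its size."""
    addr, out = 0, []
    for size in sizes:
        out.append(addr)
        addr += size
    return out

def fix_offsets(old_sizes, new_sizes, branches):
    """old_sizes/new_sizes: per-instruction byte sizes before and after
    translation.  branches: {branch_index: target_index}.
    Returns {branch_index: (old_displacement, new_displacement)} in
    bytes, measured from the branch instruction's own address."""
    old_addr, new_addr = layout(old_sizes), layout(new_sizes)
    return {i: (old_addr[t] - old_addr[i], new_addr[t] - new_addr[i])
            for i, t in branches.items()}

# e.g. instruction 1 grows from 2 to 8 bytes (one 68K op becomes four
# CF ops), so a branch from instruction 0 to instruction 3 goes from a
# 6-byte to a 12-byte displacement:
# fix_offsets([2, 2, 2, 2], [2, 8, 2, 2], {0: 3})  ->  {0: (6, 12)}
```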

The "obsession" with 68K/CF is that if Commodore were to make an A5000, it would be running a Coldfire CPU.  People are looking for a true upgrade path along the "classic" hardware lines.

Have you seen the NatAmi board with the AGA+ chipset?