
Author Topic: Of FPGAs and the way forward


SamuraiCrow (Topic starter)
Of FPGAs and the way forward
« on: August 05, 2008, 04:04:55 AM »
This post of mine from the other Natami thread is what started this one:

Quote

The Natami is pushing the hardware limits, but it will only be able to run OS 4+ if they add a PowerPC accelerator card to it. Before the Mac went to Intel, the PowerPC was a vibrant design with lots of attention; now it is relegated to game machines, where hypervisors keep ordinary people from experiencing its full potential.

I think PowerPC is as much of a dead end now as ever, and the Intel and AMD processors are reaching a dead end too: the heat limits of higher clock frequencies are turning their attention toward multiple cores. Since their software doesn't run well on multiple cores, they are going to be at a standstill.

I think the Intels will make it further than the PowerPCs because they have a more compact instruction set, but that advantage will wear thin when easier-to-use instruction sets prevail. I think the way of the future is asymmetric multicore design, where each core is dedicated to the functionality it is intended for.

What makes an Amiga, in my opinion, is the dual-bus architecture. While most computers are stuck with a left-brain-dominant design that doesn't allow for parallelism, the Amiga introduced a computer with a right hemisphere for creative thinking. On most computers that hardware may be considered a required peripheral for graphics and sound, but on the Amiga it was standard, integrated, and elegant. On the stock A1200 the design was even right-brain dominant, because the main processor was underpowered.

I could say more but if you want to hear more of why I think the way I do, I'll start a new thread.


So here I go:

Most software is serial with branches.  Most hardware is parallel but controlled by a processor.  Now the industry expects software engineers to develop parallel applications while hardware engineers take the lazy way out and stamp out duplicate cores.  This can only lead to tears, because each discipline is being asked to use the other's working techniques.

Software engineering is supposed to be easier than hardware engineering, because unique software is more common than unique hardware.  The symmetric multiprocessing path will be used in high-end applications but will remain impractical for low-end ones until new programming languages arrive that take advantage of parallelism.

Hardware engineering is always more parallel than software engineering, because that's how gate layout works on a chip.  It's always quickest to do many things at once and to resort to serialization only when you run out of space on the die.

What I'm proposing is a compromise:  FPGAs are programmable hardware, and the software that controls them is designed to convert serial programs into parallel programs whenever possible while, at the same time, maintaining synchronization with the other parts of the chip by counting clock cycles.
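To make that concrete, here is a rough C sketch of the kind of serial-to-parallel restructuring I mean.  The function names and cycle counts are mine, purely for illustration; this isn't the output of any real tool:

Code:

#include <stdint.h>

/* Serial form, as a programmer would naturally write it:
   eight dependent additions, roughly one per clock. */
uint32_t sum_serial(const uint32_t a[8])
{
    uint32_t s = 0;
    for (int i = 0; i < 8; ++i)
        s += a[i];
    return s;
}

/* Parallelized form, as a serial-to-parallel compiler might
   restructure it: an adder tree of fixed depth 3, so neighboring
   logic can stay synchronized by counting exactly 3 cycles. */
uint32_t sum_tree(const uint32_t a[8])
{
    uint32_t s01 = a[0] + a[1], s23 = a[2] + a[3];  /* level 1 */
    uint32_t s45 = a[4] + a[5], s67 = a[6] + a[7];  /* level 1 */
    uint32_t s0123 = s01 + s23, s4567 = s45 + s67;  /* level 2 */
    return s0123 + s4567;                           /* level 3 */
}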

The type of parallelism that takes place on an FPGA is much more flexible than that of parallel processors because, as long as you've got space on the chip, you can expand the bus width to 128 bits or narrow it to a 1-bit serial connection, depending on the data type.
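A software analogy for that width trade-off, again just a sketch with names of my own invention: the same XOR reduction done on a 1-bit datapath versus a full 64-bit one:

Code:

#include <stdint.h>

/* Bit-serial XOR reduction: a 1-bit-wide datapath.  In hardware this
   costs almost no area but needs one step per input bit. */
uint8_t parity_serial(uint64_t x)
{
    uint8_t p = 0;
    for (int i = 0; i < 64; ++i)
        p ^= (uint8_t)((x >> i) & 1);
    return p;
}

/* Wide XOR reduction: a full-width tree.  In hardware this is a
   64-input gate tree: much more area, but one pass. */
uint8_t parity_wide(uint64_t x)
{
    x ^= x >> 32; x ^= x >> 16; x ^= x >> 8;
    x ^= x >> 4;  x ^= x >> 2;  x ^= x >> 1;
    return (uint8_t)(x & 1);
}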

The Amiga championed the way into the multimedia age by letting a serially threaded processor multitask and by moving the most processor-intensive tasks, the right-brain graphics calculations and sound mixing, onto a separate bus.

The next generation of hardware will start there, with asymmetric cores like Intel's Atom and the Natami70.  These multicore chips put a processor, a graphics core, and sound capabilities all on one chip.  To keep costs down they will have to cache accesses differently from the dual-bus architecture of the Amiga, but there is a unique characteristic of the Amiga chipsets that the others lack:  the and/or/invert logic of the blitter's bit-masking mode is the same bit-twiddling technique used in an FPGA!

Now, by practicing the custom bit-masking modes of the blitter, software engineers can learn to become hardware engineers, and hardware engineers gain a more important role:  they can take the blitter programmers' bit-masking modes and turn them into custom chips, completing the cycle.  Software makes new hardware, and new hardware makes more efficient ways to make software.
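For anyone who hasn't programmed the blitter: each destination bit is a boolean function of the matching bits of sources A, B and C, selected by an 8-bit minterm value in BLTCON0.  A small C model of that logic (software only, not the hardware itself):

Code:

#include <stdint.h>

/* C model of the Amiga blitter's logic function: each destination bit
   is looked up in the 8-bit minterm value LF, using the matching bits
   of sources A, B and C as the index.  That is exactly how a 3-input
   lookup table in an FPGA works, just replicated across the word. */
uint16_t blit_word(uint16_t a, uint16_t b, uint16_t c, uint8_t lf)
{
    uint16_t d = 0;
    for (int bit = 0; bit < 16; ++bit) {
        unsigned idx = (((a >> bit) & 1) << 2)
                     | (((b >> bit) & 1) << 1)
                     |  ((c >> bit) & 1);
        d |= (uint16_t)((lf >> idx) & 1) << bit;
    }
    return d;
}

The classic cookie-cut operation D = (A AND B) OR (NOT A AND C) is minterm value $CA, and any of the 256 possible three-input functions can be selected the same way.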

For an example of software being translated into hardware, download this PDF slideshow from the LLVM Developers' Conference that took place last Friday.  There's a QuickTime movie to go with it if you can view it.
 

SamuraiCrow (Topic starter)
Re: Of FPGAs and the way forward
« Reply #1 on: August 05, 2008, 01:25:43 PM »
@alexh

I never said configuring the FPGA in real time was a reality; I didn't know how long the timing analysis takes, though.  What I had in mind was that more cores could be added to the chipset by converting some of the most frequently called subroutines in the OS into custom operations in hardware.
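A hedged sketch in C of what I mean.  fill32() stands in for the kind of frequently called OS primitive a profiler would flag, and the HwFillRegs layout is entirely hypothetical; nothing like it is confirmed for the Natami:

Code:

#include <stdint.h>
#include <stddef.h>

/* Software version: a hot, regular OS primitive. */
void fill32(uint32_t *dst, uint32_t value, size_t count)
{
    for (size_t i = 0; i < count; ++i)
        dst[i] = value;
}

/* Hypothetical register block for a custom fill engine.  The point is
   that the whole loop collapses into one descriptor write plus a
   start bit, freeing the CPU while the hardware streams the writes. */
typedef struct {
    volatile uint32_t dst_addr;  /* destination address            */
    volatile uint32_t value;     /* fill pattern                   */
    volatile uint32_t count;     /* number of 32-bit words         */
    volatile uint32_t start;     /* write 1 to launch; 0 when done */
} HwFillRegs;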

@bloodline
The right-brain analogy was about the custom chips having their own memory bus.  On the NatAmi there are 16 megs of Chip RAM built into the FPGA.  It is used as local-store memory for the custom chips and functions like a software cache for multimedia chip accesses.

Oddly, for whatever reason, the custom chips on the Natami can also access the Fast bus and gain access to a huge amount of memory.  So whether it's fully a dual-bus Amiga remains to be seen.

Your analogy of the PC being like an Amiga holds very true.  Many graphics cards do have their own memory and are like Amigas in that respect.  If the GPU were used for all of the sound mixing, as is proposed on the NatAmi, your AROS system could be an Amiga.
 

SamuraiCrow (Topic starter)
Re: Of FPGAs and the way forward
« Reply #2 on: August 07, 2008, 08:30:42 PM »
Quote

bloodline wrote:
Quote

SamuraiCrow wrote:

@bloodline
The right-brain analogy was about the custom chips having their own memory bus.


Well, all systems use separate busses for different parts of the system... that's nothing unique to the Amiga.

Quote

On the NatAmi there are 16 megs of Chip RAM built into the FPGA.


Is that an FPGA with 16 megs?

Quote

It is used as local-store memory for the custom chips and functions like a software cache for multimedia chip accesses.


I don't understand... software cache?

Quote

Oddly, for whatever reason, the custom chips on the Natami can also access the Fast bus and gain access to a huge amount of memory.  So whether it's fully a dual-bus Amiga remains to be seen.


Again, I don't understand... if the Custom chips can access the CPU local bus, then why bother with it?

Quote

Your analogy of the PC being like an Amiga holds very true.  Many graphics cards do have their own memory and are like Amigas in that respect.  If the GPU were used for all of the sound mixing, as is proposed on the NatAmi, your AROS system could be an Amiga.


Why not just use a DSP for the Audio mixing? Leave the GPU to do GFX stuff... perhaps?


The Natami is a hybrid of PC architecture and Amiga architecture.  It will still have 16 megs of static memory on the chip for use as chip memory, but that memory will be many times faster than the fast-page memory used in the AGA machines; it will even be faster than the main memory on the motherboard.  That's how it will function as local-store memory.  When I referred to it as a software cache, I meant that it will work much like a disk-caching system: recently accessed data sits in the 10-nanosecond chip memory, ready for future operations.
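A sketch of that staging pattern; the base address here is invented, and only the shape of the idiom matters:

Code:

#include <stdint.h>
#include <stddef.h>
#include <string.h>

#define CHIP_BASE ((uint8_t *)0x00200000)  /* hypothetical SRAM window */

/* Stage a hot buffer from slow main memory into the fast chip SRAM
   before heavy use, the way a disk cache keeps recently used blocks
   in RAM: pay for one bulk copy, then get every later access at
   local-store speed. */
uint8_t *stage_to_chip(const uint8_t *src, size_t len, size_t offset)
{
    uint8_t *dst = CHIP_BASE + offset;
    memcpy(dst, src, len);
    return dst;
}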

If you used a DSP for sound, it wouldn't be a single-chip solution.  As it is, there will probably be some external memory chips on the system board, but only because that is more configurable.  The idea is to have a mostly self-contained computer on just one chip.

As for the multi-bus architecture, it is only common in desktop models; single-chip solutions like the Intel Atom will use backside caches and other tricks to make their systems work.  Since the Amiga is designed to run from 2 megs of chip RAM, there is no conflict in having software detect and use the chip memory manually on the Natami, without wasting chip space on yet another cache controller.
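That manual placement is already the normal AmigaOS idiom: exec.library's AllocMem() lets the program, rather than a cache controller, decide what lives in chip RAM.  For instance:

Code:

#include <exec/types.h>
#include <exec/memory.h>
#include <proto/exec.h>

/* Allocate one 320x256, 1-bit-deep bitplane in chip RAM so the custom
   chips can reach it; anything the chips never touch can come from
   fast RAM instead (MEMF_FAST, or MEMF_ANY and let exec choose). */
static UBYTE *alloc_bitplane(void)
{
    return AllocMem(320 / 8 * 256, MEMF_CHIP | MEMF_CLEAR);
}

static void free_bitplane(UBYTE *bp)
{
    if (bp) FreeMem(bp, 320 / 8 * 256);
}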
 

SamuraiCrow (Topic starter)
Re: Of FPGAs and the way forward
« Reply #3 on: August 07, 2008, 10:25:44 PM »
At least that was the impression I got.  Maybe it will use external SRAM, but either way it's going to be fast for a 200 MHz chip, assuming it gets that far.
 

SamuraiCrow (Topic starter)
Re: Of FPGAs and the way forward
« Reply #4 on: August 10, 2008, 08:19:55 PM »
Update from the Natami website regarding how the memory busses are connected:

Quote

Just like the original AMIGA, the NATAMI has two fully separate memory busses:

1) The CHIP-memory bus
2) The Fast memory bus

The SuperAGA chipset has, in addition to this, a very small third memory block inside the chipset. This third block is a Sprite/3D cache and can be used to accelerate blitting and 3D operations.

The two busses are independent, and the NATAMI can do two memory operations at the same time, one on each bus. In addition to these two memory accesses on the two external busses, the NATAMI Blitter can do several memory operations per clock inside its local store.

The CHIP memory is 64 bits wide and is built from very fast pipelined, synchronous external SRAM. The fast memory on the Natami60 is regular SDRAM.

A major strength of the original AMIGA design is its DMA capability, and the Natami is fully compatible with it. But of course the SuperAGA chipset is faster and can do 100 times more DMA than the original AMIGA chipset.

The original AMIGA could do Audio and Sprite DMA into Chip memory only, and SCSI DMA into fast memory. That means it could do DMA on both memory busses, but not fully freely. The Natami improves on this design: it can do all types of DMA to both memory banks.

From a programming point of view, you can program the Natami like any other AMIGA. It is good practice, as always, to keep your audio and video data in chip memory.

For best performance you should always store your heavily accessed video data in chip memory. But as the Blitter can read from fast memory as well, you are more flexible when creating huge games, since you can also store information in the bigger fast memory.

SuperAGA is many times faster than normal AGA for several reasons:

a) The Blitter and Chip memory are 64 bits wide; AGA was 16 bits wide.

b) The Blitter and Chip memory are clocked much higher. Natami Chip memory can be clocked at 200 MHz, while the original AMIGA ran at only 3.7 MHz.

c) The Natami Blitter has a local store inside the chip, which means that if you blit the same sprite many times or draw the same texture several times, you only need to read it once. Depending on your game engine, this can up to quadruple overall performance.


So the chip bus is off of the FPGA, but there is a 32-Kbyte local store for the 3D and vector-acceleration portion of the new SuperAGA chipset.
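Taking the quoted figures at face value, the raw chip-bus bandwidth ratio works out to (64 bits × 200 MHz) / (16 bits × 3.7 MHz), which is roughly 216 to 1, so the "100 times more DMA" claim looks conservative rather than optimistic.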
 

SamuraiCrow (Topic starter)
Re: Of FPGAs and the way forward
« Reply #5 on: August 14, 2008, 03:31:09 AM »
I posted that quote from the Natami website as a correction to my previous bad information.

What the Natami website says about its speed is that it should be comparable to the Wii or the PlayStation 2.