
Author Topic: Of FPGAs and the way forward  (Read 6766 times)


Offline alexh

Re: Of FPGAs and the way forward
« on: August 05, 2008, 08:35:01 AM »
Quote

SamuraiCrow wrote:
Hardware engineering is always more parallel [snip] and only resort to serialization when you run out of space on the die.

Bollox.

Quote

SamuraiCrow wrote:
The software that controls [FPGAs] is designed to convert serial programs into parallel programs whenever possible but

Again, not true. The software you talk of does not control anything: it creates the contents of the FPGA. And it does not convert serial programs into parallel programs (not quite sure where you got that idea from). The high-level languages used to define the contents have sequential (serial) and concurrent (parallel) statements, so the designer chooses what is serial and what is parallel. The synthesis tools do not (or only very rarely) try to change that choice.
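Roughly, in Verilog terms (a made-up example of mine, not taken from any real design):

Code:
// Illustrative only: the designer, not the tool, decides what runs in
// parallel and what runs serially.
module add3 (
    input  wire       clk,
    input  wire [7:0] a, b, c,
    output wire [7:0] sum_par,   // concurrent (parallel) version
    output reg  [7:0] sum_ser    // sequential (serial) version
);
    // Concurrent statement: two adders sit side by side in the fabric
    // and the result is ready within a single cycle.
    assign sum_par = (a + b) + c;

    // Sequential statements in a clocked block: one addition per cycle,
    // so the same job takes two cycles but can share a single adder.
    reg       phase = 1'b0;
    reg [7:0] acc   = 8'd0;
    always @(posedge clk) begin
        if (!phase) acc     <= a + b;    // cycle 1
        else        sum_ser <= acc + c;  // cycle 2
        phase <= ~phase;
    end
endmodule

The tools implement what you wrote; they will not, in general, turn one form into the other.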

Quote

SamuraiCrow wrote:
at the same, maintain synchronization with the other parts of the chip by counting clock cycles.

Yup, static timing analysis... all done over several minutes while creating the image to load onto the FPGA.

Quote

SamuraiCrow wrote:
The type of parallelism that takes place on an FPGA is much more flexible than parallel processors because, as long as you've got space on the chip, you can expand the bus width to 128 bits or narrow it to 1 serial connection depending on the data type.

Not real-time you can't.
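You can certainly parameterise a bus width in the HDL, something like the made-up sketch below, but the width is fixed when the bitstream is generated, not while the part is running:

Code:
// Width is a build-time parameter: 1 bit for a serial link or 128 bits
// for a wide bus, chosen when the FPGA image is generated, not at runtime.
module data_reg #(
    parameter WIDTH = 32
) (
    input  wire             clk,
    input  wire             en,
    input  wire [WIDTH-1:0] d,
    output reg  [WIDTH-1:0] q
);
    always @(posedge clk)
        if (en) q <= d;
endmodule

// Two separate builds, not something you can switch between live:
//   data_reg #(.WIDTH(128)) wide_bus   ( /* ... */ );
//   data_reg #(.WIDTH(1))   serial_bit ( /* ... */ );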

You seem to be talking about reconfigurable hardware using FPGAs. While this has been discussed and experimented with over the years, devices that support partial reprogramming are not common (although there are some on the market).

You could not reprogram the entire FPGA in real time, as it currently takes many, many hours to recalculate complex designs. You have to cut the design into manageable hierarchical structures which can individually be reconfigured. However, if a reconfiguration exceeds the resources previously owned by that block... you're screwed. You need to have worked out the sizes of all possible reconfigurations before you implement the overall FPGA.

And as I said, most devices do not support partial reconfiguration. Not to mention the problem of preserving the data already in the pipelines.
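To give a feel for it (a made-up sketch, the module names are mine): the static design pins down a fixed socket, and every variant you might ever load has to fit the same ports and the same floorplanned resource budget:

Code:
// Made-up sketch of a "reconfigurable socket". The port list and the
// resources reserved for this region are frozen when the static design is
// implemented; every variant loaded later must fit within that budget.
module reconfig_region (
    input  wire        clk,
    input  wire [31:0] din,
    output wire [31:0] dout
);
    // Stand-in contents. Variant A might be a filter, variant B a CRC,
    // but neither may need more logic or RAM than this block was given.
    assign dout = din;
endmodule

module top (
    input  wire        clk,
    input  wire [31:0] data_in,
    output wire [31:0] data_out
);
    reconfig_region u_socket (
        .clk  (clk),
        .din  (data_in),
        .dout (data_out)
    );
endmodule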

BUT: It's very interesting stuff... undoubtedly the future of some areas of electronics.

Quote

SamuraiCrow wrote:
The and/or/invert logic in the blitter's bit-masking mode is the same bit-twiddling technique used in the FPGA!

At the lowest possible level.
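Both boil down to an 8-entry truth table: the blitter's minterm byte and a 3-input FPGA LUT. A throwaway Verilog illustration (mine, per bit):

Code:
// Throwaway illustration: an 8-bit truth table indexed by three inputs,
// which is what a blitter minterm select and a 3-input LUT both amount to.
module minterm3 (
    input  wire [7:0] lf,      // truth-table / minterm-select byte
    input  wire       a, b, c, // the three source bits
    output wire       d
);
    assign d = lf[{a, b, c}];  // the inputs pick one entry of the table
endmodule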

Quote

SamuraiCrow wrote:
Now, by practicing the custom bit-masking modes of the blitter, software engineers can learn to become hardware engineers and hardware engineers have a more important role. Hardware engineers can take the bit-masking modes of the blitter programmers and make them into custom chips thus completing the cycle.

Erm, I don't think so. :-)

The overheads of reprogramming will be (for a long time) worse than just doing it in software.

What would be better is if software engineers knew how to use the profiling tools that come with their system: profile where the code spends most of its time, work out what "acceleration" hardware they would like, and then reconfigure some "spare" logic to do that task. Reconfiguring at the data-path level is not practical yet.
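Something like the toy sketch below (the register interface and names are entirely hypothetical): once profiling shows the hot loop is, say, a multiply-accumulate, that one operation gets handed to a block built from the spare fabric:

Code:
// Entirely hypothetical sketch of the sort of "acceleration" block I mean:
// a multiply-accumulate sitting behind a couple of host-visible registers.
module mac_accel (
    input  wire        clk,
    input  wire        rst,
    input  wire        wr,     // write strobe from the host bus
    input  wire [15:0] op_a,
    input  wire [15:0] op_b,
    output reg  [39:0] acc     // running sum, readable by the host
);
    always @(posedge clk) begin
        if (rst)
            acc <= 40'd0;
        else if (wr)
            acc <= acc + (op_a * op_b);  // one MAC per bus write
    end
endmodule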

Edit: Just read the slides. Interesting how they brush over the reconfigurability aspect.

P.S. I think I had a board like the one in those slides years ago. We're all Virtex4 in our office now.
 

Offline alexh

Re: Of FPGAs and the way forward
« Reply #1 on: August 07, 2008, 10:18:10 PM »
Quote

SamuraiCrow wrote:
On the NatAmi there are 16-megs of Chip RAM built in to the FPGA.

Yeah right... NOT!

An FPGA with 16 Mbytes of on-chip SRAM would cost more than the gross debt of Northern Rock!
 

Offline alexh

Re: Of FPGAs and the way forward
« Reply #2 on: August 07, 2008, 10:27:18 PM »
It's gotta be external. The biggest FPGAs today only have about 3 Mbytes of on-chip SRAM, and they cost about $10,000 each.

You have to take everything written by the Natami wannabe "Gunnar von Boehn" as bollox.

Only Thomas Hirsch knows what is really going on.