
Author Topic: How exactly does the Amiga RTG system work?  (Read 1397 times)


Offline gazsp (Topic starter)

  • Jr. Member
  • **
  • Join Date: Sep 2004
  • Posts: 71
    • http://realitydesign.asn.org.uk
How exactly does the Amiga RTG system work?
« on: October 28, 2005, 10:35:23 AM »
I was just wondering, when you use a graphics card with a big box Amiga, how exactly does the system know how to use the card?

Does graphics.library just get patched so that all OS calls go to the cards library / drivers instead of using the native chipset?

I'd just like to know as I don't have much understanding of the Amiga's RTG system.
 

Offline Piru

  • \' union select name,pwd--
  • Hero Member
  • *****
  • Join Date: Aug 2002
  • Posts: 6946
    • http://www.iki.fi/sintonen/
Re: How exactly does the Amiga RTG system work?
« Reply #1 on: October 28, 2005, 11:25:31 AM »
@gazsp
Quote
how exactly does the system know how to use the card?

User selects graphics card screen mode for desktop and/or applications.

Quote
Does graphics.library just get patched so that all OS calls go to the cards library / drivers instead of using the native chipset?

graphics.library does indeed get patched, but only to make it possible to use the standard functions on chunky 8-bit and truecolour screens. Typically the original chipset modes remain functional (under both CGX and Picasso96).
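
To make that concrete, here is a minimal sketch (not from this thread) of what an OS-friendly application does: it only asks for a screen mode, and if that mode belongs to a graphics card, the patched OS routes the rendering to the card's driver. The mode-picking call assumes CyberGraphX's cybergraphics.library; Picasso96 also provides a compatible cybergraphics.library.

Code:
/* Minimal sketch: the application never talks to the card directly.
 * intuition.library is assumed to be auto-opened by the startup code.
 */
#include <proto/exec.h>
#include <proto/intuition.h>
#include <proto/cybergraphics.h>
#include <cybergraphx/cybergraphics.h>
#include <graphics/displayinfo.h>
#include <intuition/screens.h>

struct Library *CyberGfxBase;

struct Screen *open_rtg_screen(void)
{
    ULONG modeid;

    CyberGfxBase = OpenLibrary("cybergraphics.library", 41);
    if (!CyberGfxBase)
        return NULL;

    /* Ask the RTG system for a 640x480, 16-bit mode on whatever card is fitted */
    modeid = BestCModeIDTags(CYBRBIDTG_NominalWidth,  640,
                             CYBRBIDTG_NominalHeight, 480,
                             CYBRBIDTG_Depth,         16,
                             TAG_DONE);
    if (modeid == INVALID_ID)
        return NULL;

    /* A perfectly ordinary Intuition call; the RTG patches take over from here */
    return OpenScreenTags(NULL,
                          SA_DisplayID, modeid,
                          SA_Title,     (ULONG)"RTG test",
                          TAG_DONE);
}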
 

Offline gazsp (Topic starter)

  • Jr. Member
  • **
  • Join Date: Sep 2004
  • Posts: 71
    • http://realitydesign.asn.org.uk
Re: How exactly does the Amiga RTG system work?
« Reply #2 on: October 28, 2005, 11:53:05 AM »
Quote
User selects graphics card screen mode for desktop and/or applications.


I know that much :-) Is it just the files in Devs/Monitors that inform the system of stuff like where the screen memory is located, what format the data is in (i.e. planar, chunky) etc.?

I just don't really understand how it all gets set up initially in software so that the system knows to stop using (reading and writing to / from) planar Chip RAM, and start using the memory / GPU on the graphics card.
 

Offline Piru

  • \' union select name,pwd--
  • Hero Member
  • *****
  • Join Date: Aug 2002
  • Posts: 6946
    • http://www.iki.fi/sintonen/
Re: How exactly does the Amiga RTG system work?
« Reply #3 on: October 28, 2005, 12:25:25 PM »
@gazsp
Quote
Is it just the files in Devs/Monitors that inform the system of stuff like where the screen memory is located, what format the data is in (i.e. planar, chunky) etc. ?

Well, not really, as the OS itself has no knowledge of such things. It only knows about planar stuff. However, CyberGraphX or Picasso96 patch the system so that if a gfx card mode is used, the correct things are displayed. All such details as the framebuffer address and size are internal to the graphics card system software.
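
For illustration (a sketch, not code from the thread), this is roughly how a program asks the RTG software, rather than the OS, about those internals. The query call is standard cybergraphics.library API, and 'bm' is assumed to be the BitMap of a screen the user opened in a gfx card mode.

Code:
/* struct BitMap only describes planar data, so the framebuffer details
 * live inside the RTG software and are queried through its own call.
 */
#include <stdio.h>
#include <proto/cybergraphics.h>
#include <cybergraphx/cybergraphics.h>
#include <graphics/gfx.h>

extern struct Library *CyberGfxBase;   /* assumed opened elsewhere */

void describe_bitmap(struct BitMap *bm)
{
    if (GetCyberMapAttr(bm, CYBRMATTR_ISCYBERGFX))
    {
        /* Chunky/RGB bitmap managed by the RTG system, living in card memory */
        printf("RTG bitmap: %lux%lu, depth %lu, pixfmt %lu\n",
               GetCyberMapAttr(bm, CYBRMATTR_WIDTH),
               GetCyberMapAttr(bm, CYBRMATTR_HEIGHT),
               GetCyberMapAttr(bm, CYBRMATTR_DEPTH),
               GetCyberMapAttr(bm, CYBRMATTR_PIXFMT));
    }
    else
    {
        /* Ordinary planar BitMap in Chip RAM, handled by the original chipset */
        printf("Native planar bitmap, depth %lu\n", (ULONG)bm->Depth);
    }
}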

Quote
I just don't really understand how it all gets set up initially in software so that the system knows to stop using (reading and writing to / from) planar Chip RAM, and start using the memory / GPU on the graphics card.

There is a good reason you don't: it doesn't. Only OS-friendly apps that use proper OS calls to do their rendering work on the gfx card. Any hardware-banging apps will always use the native graphics output, or display only partially on the gfx card output (missing parts of the GUI, for example).
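
A hypothetical side-by-side of the two cases (not from the thread): the first routine is pure OS rendering and gets retargeted by the RTG patches, the second pokes the custom chips directly and only ever affects the native Amiga display. graphics.library is assumed to be open.

Code:
#include <exec/types.h>
#include <proto/graphics.h>
#include <graphics/rastport.h>

void os_friendly_fill(struct RastPort *rp)
{
    SetAPen(rp, 1);
    RectFill(rp, 0, 0, 319, 199);          /* redirected by the RTG patches */
}

void hardware_banging(void)
{
    /* COLOR00 register at $DFF180: background colour = red, native chipset only */
    *(volatile UWORD *)0xDFF180 = 0x0F00;
}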
 

Offline Karlos

  • Sockologist
  • Global Moderator
  • Hero Member
  • *****
  • Join Date: Nov 2002
  • Posts: 16879
  • Country: gb
  • Thanked: 5 times
Re: How exactly does the Amiga RTG system work?
« Reply #4 on: October 28, 2005, 05:10:25 PM »
In my not so humble opinion, existing RTG on 3.x is a big, fat, dissatisfying hack. Of course, it has to be in order to work: as Piru points out, graphics.library, layers.library etc. were designed around the original hardware. The RTG software patches all the critical stuff that opens screens, allocates bitmaps etc. and performs rendering on them. Completely OS-legal, non-hardware-banging programs will generally work fine under RTG.
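
As an illustration of "completely OS legal" rendering under RTG (a sketch, not code from this thread): the off-screen buffer is allocated as a friend of the screen's BitMap, so the patched AllocBitMap() can keep it in card memory in the screen's own pixel format, and the blit is an ordinary graphics.library call. 'scr' is assumed to be an already-open gfx card screen.

Code:
#include <proto/graphics.h>
#include <intuition/screens.h>
#include <graphics/gfx.h>

void blit_buffer(struct Screen *scr)
{
    struct BitMap *screen_bm = scr->RastPort.BitMap;
    struct BitMap *buf;

    buf = AllocBitMap(320, 200,
                      GetBitMapAttr(screen_bm, BMA_DEPTH),
                      BMF_MINPLANES,   /* let the RTG system pick the layout */
                      screen_bm);      /* friend bitmap: same format as the screen */
    if (buf)
    {
        /* ... render into 'buf' through a RastPort here ... */
        BltBitMapRastPort(buf, 0, 0, &scr->RastPort, 0, 0, 320, 200, 0xC0);
        FreeBitMap(buf);
    }
}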

What irks me about RTG under 3.x is that many calls that the driver could accelerate with the GPU on your graphics card don't appear to be accelerated, if many of my old experiments are anything to go by. Apart from basic blitting and block fills, that is. Most GPUs presently used in Amiga graphics cards offer a bit more than just that, and graphics.library has several routines that would make good candidates, e.g. BitMapScale(), or colourspace conversion when blitting between different colour formats (provided both BitMaps are in memory addressable by the GPU, that is).
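
For example, the scaling candidate looks like this from the application's side (a sketch; BitMapScale() has been in graphics.library since V36, and whether it runs on the card's blitter or the CPU is entirely up to the RTG driver):

Code:
#include <proto/graphics.h>
#include <graphics/gfx.h>
#include <graphics/scale.h>

void stretch_160x100_to_320x200(struct BitMap *src, struct BitMap *dst)
{
    struct BitScaleArgs args = { 0 };

    args.bsa_SrcBitMap   = src;
    args.bsa_DestBitMap  = dst;
    args.bsa_SrcWidth    = 160;
    args.bsa_SrcHeight   = 100;
    args.bsa_XSrcFactor  = 160;    /* scale factors are given as ratios:     */
    args.bsa_XDestFactor = 320;    /*   dest/src = 320/160 = 2x horizontally */
    args.bsa_YSrcFactor  = 100;
    args.bsa_YDestFactor = 200;    /*   dest/src = 200/100 = 2x vertically   */

    BitMapScale(&args);
}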

Another problem, with the 3.x OS routines at least, is that they are tied to an 8-bit view of the world. When you have a 16-bit RGB screen or better, having to deal with ObtainPen() etc. is just irritating beyond belief.
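
A small sketch of that contrast (assuming cybergraphics.library v41+ and graphics.library are open; 'cm' is the screen's ColorMap, 'rp' a RastPort on a truecolour RTG screen): the stock route forces a trip through the pen allocator even though there is no real palette, while CGX's FillPixelArray() simply takes an RGB value.

Code:
#include <proto/graphics.h>
#include <proto/cybergraphics.h>
#include <cybergraphx/cybergraphics.h>
#include <graphics/view.h>
#include <graphics/rastport.h>
#include <utility/tagitem.h>

extern struct Library *CyberGfxBase;   /* assumed opened elsewhere */

void fill_red_twice(struct RastPort *rp, struct ColorMap *cm)
{
    /* The 8-bit view of the world: allocate a pen, render, release the pen */
    LONG pen = ObtainBestPen(cm, 0xFFFFFFFF, 0, 0,
                             OBP_Precision, PRECISION_EXACT, TAG_DONE);
    if (pen != -1)
    {
        SetAPen(rp, pen);
        RectFill(rp, 0, 0, 99, 99);
        ReleasePen(cm, pen);
    }

    /* The truecolour way: just say which colour you want (0x00RRGGBB) */
    FillPixelArray(rp, 100, 0, 100, 100, 0x00FF0000);
}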

CGX/P96 both allow you to get hold of the pixel representation of a chunky/RGB bitmap and do low-level custom rendering. Since they provide this ability, what they really missed out on was the potential to build upon it and provide an alternative low-level graphics interface for games and multimedia apps that could be fully hardware accelerated (stretched blits, transparent blits, primitive rasterization etc.).
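
That low-level access looks roughly like this (a sketch assuming a 16-bit, 2-bytes-per-pixel bitmap and cybergraphics.library open in CyberGfxBase): LockBitMapTags() hands back the framebuffer address and row modulo, and the lock should be held only briefly and released with UnLockBitMap().

Code:
#include <proto/cybergraphics.h>
#include <cybergraphx/cybergraphics.h>
#include <graphics/gfx.h>
#include <utility/tagitem.h>

extern struct Library *CyberGfxBase;   /* assumed opened elsewhere */

void plot_direct(struct BitMap *bm, int x, int y, UWORD colour)
{
    ULONG base = 0, bytesperrow = 0;
    APTR  handle;

    handle = LockBitMapTags(bm,
                            LBMI_BASEADDRESS, (ULONG)&base,
                            LBMI_BYTESPERROW, (ULONG)&bytesperrow,
                            TAG_DONE);
    if (handle)
    {
        UWORD *pixel = (UWORD *)(base + (ULONG)y * bytesperrow + (ULONG)x * 2);
        *pixel = colour;               /* write straight into card memory */
        UnLockBitMap(handle);
    }
}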