
Author Topic: CGX emulation under P96 (not bad after all)  (Read 931 times)


Karlos (Topic starter)
CGX emulation under P96 (not bad after all)
« on: April 24, 2004, 02:20:30 PM »
Hi!

It's Saturday, so that must mean another odd coding question from yours truly.

I'm having some odd incompatibilities with some CGX code running under P96.

The code in question simply uses the CGX API to query a normal BitMap for some of its properties, in particular the pixel format, modulus, etc. Only CGX v3 features are used, in an effort to stay P96 compatible.

This information is then used to direct code that renders directly into the BitMap using LockBitMapTags().
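Roughly, the query/lock side of it looks like this (a simplified sketch rather than the actual framework code; the helper name is made up, error handling is stripped, and it assumes cybergraphics.library is already open):

#include <exec/types.h>
#include <graphics/gfx.h>
#include <utility/tagitem.h>
#include <cybergraphx/cybergraphics.h>
#include <proto/cybergraphics.h>

/* Query an RTG BitMap and lock it for direct rendering.
   Returns false if the BitMap isn't a CGX/P96 one. */
bool lockAndRender(struct BitMap* bitMap)
{
    if (!GetCyberMapAttr(bitMap, CYBRMATTR_ISCYBERGFX)) {
        return false;            /* planar/native BitMap, nothing to do here */
    }

    ULONG pixFmt      = 0;
    ULONG bytesPerRow = 0;       /* the modulus */
    APTR  baseAddress = 0;

    APTR lock = LockBitMapTags(
        bitMap,
        LBMI_PIXFMT,      (ULONG)&pixFmt,
        LBMI_BYTESPERROW, (ULONG)&bytesPerRow,
        LBMI_BASEADDRESS, (ULONG)&baseAddress,
        TAG_DONE
    );
    if (!lock) {
        return false;
    }

    /* ...render directly into baseAddress here, stepping rows by
       bytesPerRow and picking the conversion routine from pixFmt... */

    UnLockBitMap(lock);
    return true;
}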

The code simply converts a rectangular block of pixels from any one CGX-defined pixel format to any other. This is used for "on-the-fly" pixel conversion from one place to another (typically from a Fast RAM buffer in a sensible format to the video RAM surface in some bizarre HW-dependent format).

On my CGX installation the code works perfectly. I can take a rectangular block of pixels in, say, 32-bit ARGB format and apply it to my BitMap, which is in some device-dependent format determined by my display (say 16-bit RGB565); the code determines this from the queried data and calls the optimal ARGB32 -> RGB565 conversion routine.
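To give an idea of the per-pixel work involved, this is roughly what an ARGB32 -> RGB565 path boils down to (an illustrative sketch only, not the optimised routine, and the function name is made up):

#include <exec/types.h>

/* Convert a w x h block of 32-bit ARGB pixels to 16-bit RGB565.
   srcMod / dstMod are the respective moduli in bytes. */
void convertARGB32toRGB565(const UBYTE* src, ULONG srcMod,
                           UBYTE* dst, ULONG dstMod,
                           ULONG w, ULONG h)
{
    for (ULONG y = 0; y < h; ++y) {
        const ULONG* s = (const ULONG*)(src + y * srcMod);
        UWORD*       d = (UWORD*)(dst + y * dstMod);

        for (ULONG x = 0; x < w; ++x) {
            ULONG argb = s[x];
            ULONG r = (argb >> 16) & 0xFF;
            ULONG g = (argb >> 8)  & 0xFF;
            ULONG b =  argb        & 0xFF;

            /* 8:8:8 -> 5:6:5, alpha discarded */
            d[x] = (UWORD)(((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3));
        }
    }
}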

I could just as easily take data in some esoteric format, e.g. little-endian 15-bit BGR, and do the same thing as above, and the appropriate conversion is called.

The point is, any supported pixel format in -> correct pixel format out. It's transparent to the caller.

I've tested the routines pretty exhaustively and they all work just fine on two different CGX systems (a BVision and a CV64).

Next, I tried it under P96, both under UAE and on a Mediator/Voodoo3000 combo. Both produce wrong results. Specifically, it seems two things occur:

1) The modulus reported by the emulated CGX for a 16-bit BitMap seems screwed up somehow, causing a diagonal hash pattern.

2) In 32-bit modes, the modulus is correct, *but* the ARGB component order on screen appears transposed from the one reported.

One thing I noticed about (2) is that P96 may support some 32-bit pixel formats that perhaps CGX does not. I don't recall seeing 32-bit ABGR under CGX, but that's not what P96 is using for 32-bit modes here - it's using BGRA32, which is supposedly supported in CGX too.

So, has anybody else encountered similar problems and what did you do to remedy it?

Thomas

Re: Bad CGX emulation under P96
« Reply #1 on: April 24, 2004, 07:04:49 PM »

Why do you convert manually? If both BitMaps are deeper than 8 bits, you can just BltBitMap() from one into the other and CGX/P96 does the conversion for you.
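Something like this (an untested sketch typed from memory; the wrapper name is made up):

#include <exec/types.h>
#include <graphics/gfx.h>
#include <proto/graphics.h>

/* Let graphics.library / the RTG system do the format conversion:
   a plain copy blit between two chunky BitMaps of different formats. */
void copyBlock(struct BitMap* src, struct BitMap* dst,
               LONG srcX, LONG srcY, LONG dstX, LONG dstY,
               LONG w, LONG h)
{
    BltBitMap(src, srcX, srcY,
              dst, dstX, dstY,
              w, h,
              0xC0,    /* minterm: straight copy of the source */
              0xFF,    /* plane mask */
              NULL);   /* no scratch plane needed */
}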

Bye,
Thomas

Piru
Re: Bad CGX emulation under P96
« Reply #2 on: April 24, 2004, 07:52:42 PM »
@Thomas

I guess he preconverts the gfx so that the actual blitting (which might happen a lot) doesn't need to convert anymore.
 

Karlos (Topic starter)
Re: Bad CGX emulation under P96
« Reply #3 on: April 24, 2004, 08:30:10 PM »
@Piru

That's one of the reasons, amongst others. I like to be able to get things into device-dependent format for rapid hardware manipulation, especially blitting, etc.

The code forms part of the graphics library code of my framework. I have a class "Surface" which specifically represents a rectangular pixel array in video RAM. Under the AmigaOS implementation, this is an OS BitMap that is a friend of some existing visible BitMap, hence we have no real say in its exact format.
Another class in the same library is "ImageBuffer", which simply encapsulates an array of pixel data that would typically be stored in normal memory and allows direct manipulation. The ImageBuffer class is in the common part of the source and hence does not encapsulate a BitMap; rather, it just wraps a block of normal memory.

You can paint your ImageBuffer onto a Surface using Surface::putImageBuffer(image, sx, sy, ix, iy, w, h), where sx/sy = position on the Surface, ix/iy = position within ImageBuffer and w/h define the rectangle to draw.
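A hypothetical usage snippet, just to show the call shape (the surrounding names, and whether the image is passed by pointer, are made up for the example):

void repaint(Surface* surface, ImageBuffer* canvas)
{
    // Paint a 64 x 48 block taken from (16,16) inside the ImageBuffer
    // onto the Surface at (100,50); the format conversion happens inside.
    surface->putImageBuffer(canvas, 100, 50, 16, 16, 64, 48);
}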

An example application of the above follows:

In a simple paint demo built on the framework (source code here), we have a canvas which is, e.g., a 32-bit ARGB ImageBuffer. We draw on this and then paint the changed area onto a Surface obtained from our display.

Naturally, the latter is probably not in the same format as the former and the appropriate conversion happens when we call putImageBuffer().

It all works perfectly under CGX using the query methods and direct bitmap access. It also works under P96; the results are just screwed up :-(

-edit-

@Thomas

Incidentally, there is also a Surface::putSurface() method that is used in the same way as putImageBuffer(). This method does use BltBitMap() internally ;-)

Karlos (Topic starter)
Re: Bad CGX emulation under P96
« Reply #4 on: April 26, 2004, 03:04:42 PM »
 :bump:

All fixed :-) It was actually my fault and nothing to do with Picasso96's CGX emulation after all :oops:

The modulus bug was a rogue "addq #1 ..." in the one specific ARGB -> RGB565PC conversion routine, and the colour transposition bug for BGRA was a misinterpretation of the pixel format due to a wrongly mapped value in my code.
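In C terms, the effect of that stray addq was something like this (illustrative only, names made up):

#include <exec/types.h>

/* Why a one-byte error in the row advance shows up as a diagonal hash:
   each destination row must start exactly bytesPerRow after the previous
   one, otherwise every row drifts a little further sideways. */
void fillRect16(UBYTE* dst, ULONG bytesPerRow, ULONG w, ULONG h, UWORD pixel)
{
    for (ULONG y = 0; y < h; ++y) {
        UWORD* row = (UWORD*)dst;
        for (ULONG x = 0; x < w; ++x) {
            row[x] = pixel;
        }
        dst += bytesPerRow;         /* correct */
        /* dst += bytesPerRow + 1;     what the stray addq #1 amounted to */
    }
}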

Neither of these bugs cropped up under normal CGX systems, because neither of the target modes was supported by the gfx cards tested.

So, does anybody here actually know why CGX doesn't support a 32-bit ABGR pixel format? It seems odd that only three of the four obvious 32-bit formats are supported, when P96 supports all four.
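For reference, the 32-bit chunky formats in <cybergraphx/cybergraphics.h> boil down to these (a quick sketch from memory, so check your own includes):

#include <exec/types.h>
#include <cybergraphx/cybergraphics.h>

/* The 32-bit chunky formats CGX defines -- note there is no PIXFMT_ABGR32,
   which is exactly the missing fourth permutation. */
const char* describe32BitFormat(ULONG pixFmt)
{
    switch (pixFmt) {
        case PIXFMT_ARGB32: return "A8R8G8B8";
        case PIXFMT_BGRA32: return "B8G8R8A8";
        case PIXFMT_RGBA32: return "R8G8B8A8";
        default:            return "not a 32-bit CGX format";
    }
}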