Amiga.org
Amiga computer related discussion => Amiga Software Issues and Discussion => Topic started by: AmigaMance on November 11, 2009, 03:39:31 AM
-
I remember that it had become a standard on PC's and it was called High Color, or something? Neither CGX nor P96 supports 18bit modes. You go straight from 16 to 24bit modes.
I think it would have been a good compromise between speed/memory usage and image quality.
-
I remember that it had become a standard on PC's and it was called High Color, or something?
I don't. I do however remember that the VGA palette was 18bit.
I think it would have been a good compromise between speed/memory usage and image quality.
How exactly?
To get 18bit you'd still need to have 3 bytes per pixel, or use some kind of weirdo mode where the 6 extra bits would always be used for the next pixel. That'd be prohibitively complex and slow to work with.
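To make that point concrete, here is a rough C sketch of what merely reading one pixel from a hypothetical tightly packed 18bit framebuffer might look like. The packing convention (little-endian bit order, no padding) is an assumption made up purely for illustration, not a mode any chipset actually offered:

[code]
#include <stdint.h>

/* Hypothetical "packed 18bit" framebuffer: pixels stored back to back
 * with no padding, so every pixel starts at a different bit offset and
 * the leftover 6 bits of each byte group belong to the next pixel.
 * Just reading one pixel needs a bit-address calculation plus shifting
 * and masking across several bytes: */
static uint32_t read_packed18(const uint8_t *buf, uint32_t n)
{
    uint32_t bit  = n * 18;      /* absolute bit offset of pixel n */
    uint32_t byte = bit >> 3;    /* first byte containing it       */
    uint32_t sh   = bit & 7;     /* bit offset inside that byte    */

    /* gather 4 bytes, which always cover the 18 wanted bits */
    uint32_t v = (uint32_t)buf[byte]
               | (uint32_t)buf[byte + 1] << 8
               | (uint32_t)buf[byte + 2] << 16
               | (uint32_t)buf[byte + 3] << 24;

    return (v >> sh) & 0x3FFFFu; /* 18 significant bits, e.g. 6R:6G:6B */
}
[/code]

Doing that shift-and-mask dance for every pixel is exactly the kind of overhead that makes plain 16bit or 24/32bit pixels the practical choices.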
-
I don't. I do however remember that the VGA palette was 18bit.
I remember reading some PC game reviews, back in the early 90s, where some games were using 18bit screens and the editors called it high color. There is a possibility that I'm totally wrong about it.
How exactly?
To get 18bit you'd still need to have 3 bytes per pixel, or use some kind of weirdo mode where the 6 extra bits would always be used for the next pixel. That'd be prohibitively complex and slow to work with.
Ah, ok. I just followed the simplistic reasoning "less colors, more speed".
-
You'd be amazed at what the VGA chipset can actually do - it's not called the Versatile Graphics Array for nothing. However, most of its fancy features were never used because the implementations varied so much from card to card, I'd imagine. But if you knew the chipset you were working with rather than relying on UniVBE or something, you could do great things, including, I suspect, changing the palette on the fly... which would give you an 18-bit screen if the palette is 18 bit. Of course for that you'd have to write a display routine to switch palettes - no copper on VGA, after all - so your program would be dog slow. Good for text adventures and the like though, I'd imagine... some 8-bit computers did that, top of the screen in one screen mode for the picture, then switch to text mode for the input/output.
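For the curious, here is a rough DOS-era C sketch of the kind of "poor man's copper" loop being described, using the standard VGA DAC and status ports (0x3C8/0x3C9/0x3DA) and Borland/DJGPP-style outportb()/inportb(). The timing is deliberately simplified, so treat it as an illustration of why it eats the CPU rather than as working demo code:

[code]
#include <dos.h>   /* outportb()/inportb(), Borland / DJGPP style */

#define DAC_WRITE_INDEX 0x3C8
#define DAC_DATA        0x3C9
#define INPUT_STATUS_1  0x3DA   /* bit 3 = vertical retrace, bit 0 = blanking */

/* Load one 6-bit-per-channel DAC (palette) entry. */
static void set_dac_entry(unsigned char idx,
                          unsigned char r, unsigned char g, unsigned char b)
{
    outportb(DAC_WRITE_INDEX, idx);
    outportb(DAC_DATA, r);   /* 0..63 */
    outportb(DAC_DATA, g);
    outportb(DAC_DATA, b);
}

/* Crude per-line palette change: burn the whole frame busy-waiting on
 * the status port and reprogram one palette entry per blanking period.
 * All the CPU time goes into polling, which is why such a routine
 * leaves little room for anything else. */
static void palette_per_line(void)
{
    int line;

    while  (inportb(INPUT_STATUS_1) & 0x08);    /* wait out any current retrace */
    while (!(inportb(INPUT_STATUS_1) & 0x08));  /* wait for next vertical retrace */

    for (line = 0; line < 200; line++) {
        while (!(inportb(INPUT_STATUS_1) & 0x01));          /* beam leaves active area */
        set_dac_entry(0, 0, 0, (unsigned char)(line & 63)); /* recolour index 0        */
        while  (inportb(INPUT_STATUS_1) & 0x01);            /* beam back in active area */
    }
}
[/code]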
-
Hi,
I remember that VGA chips such as the "VGA Paradise" from Western Digital had an 18-bit palette (262,144 colours) but could only display 256 of them (8 bits) at a time.
-
I've always thought hicolor meant 16bit - 64K colors ?!?
-
You'd be amazed at what the VGA chipset can actually do - it's not called the Versatile Graphics Array for nothing.
Erm - VGA stands for Video Graphics Array. And there have never been 18 bit deep modes...
Plus: 15 or 16 bit HiColor modes are not palettized, of course - imagine a 48 KByte RAMDAC (single chip!) running at ~100 MHz in 1990... HiColor and TrueColor modes work by bypassing the RAM part and feeding directly into the DAC part.
Imho VGA was the first usable video standard on the PC platform - MDA, CGA, Hercules and EGA were total crap when it came to graphics. When it came to 'magic' like changing display properties on the fly (like Copper does), VGA was inferior to Amiga graphics by far. However, later VGA chips added so much performance that it became unnecessary to play with the modes...
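As an aside, the "not palettized" point above boils down to the pixel value itself encoding the colour, so no lookup RAM is needed at all. A minimal C sketch of the common 16bit RGB565 packing (5 bits red, 6 green, 5 blue), assuming 8-bit input components, looks like this:

[code]
#include <stdint.h>

/* 16-bit HiColor: no palette lookup - the pixel value itself is the
 * colour, usually packed as 5 bits red, 6 bits green, 5 bits blue.
 * (15-bit modes use 5:5:5 with one bit unused.) */
static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((unsigned)(r >> 3) << 11) |  /* top 5 bits of red   */
                      ((unsigned)(g >> 2) << 5)  |  /* top 6 bits of green */
                       (unsigned)(b >> 3));         /* top 5 bits of blue  */
}
[/code]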
-
I remember that it had become a standard on PC's and it was called High Color, or something? Neither CGX or P96 support 18bit modes. You go straight from 16 to 24bit modes.
There actually was a "fake" 18-bit mode used in quite a lot of PC demos in the mid-90s. This was basically achieved by setting the palette to include 64 shades each of red, green and blue, and then drawing the RGB components on different lines, which, with a high enough vertical resolution, kind of blended into 18-bit colors. Probably best illustrated by this image: http://www.oldskool.org/demos/explained/explained/htmlpictures/18bitori.gif
This was of course not a hardware graphics mode, but a programmed one.
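A minimal sketch of how such a component-interleaved trick might be set up in mode 13h (320x200, 256 colours): the helper names, the 6:6:6 source format and the 320x66 source size are assumptions for illustration, and the DAC port writes use the usual VGA 0x3C8/0x3C9 convention:

[code]
#include <dos.h>      /* outportb(), Borland / DJGPP style */
#include <stdint.h>

/* DAC entries 0-63 become shades of red, 64-127 green, 128-191 blue
 * (VGA DAC channels are 6 bits, 0..63). The write index auto-increments
 * after every R,G,B triplet, so one index write is enough. */
static void setup_component_palette(void)
{
    int i;
    outportb(0x3C8, 0);
    for (i = 0; i < 64; i++) { outportb(0x3C9, i); outportb(0x3C9, 0); outportb(0x3C9, 0); }
    for (i = 0; i < 64; i++) { outportb(0x3C9, 0); outportb(0x3C9, i); outportb(0x3C9, 0); }
    for (i = 0; i < 64; i++) { outportb(0x3C9, 0); outportb(0x3C9, 0); outportb(0x3C9, i); }
}

/* src holds 6:6:6 pixels for a 320 x 66 source image; dst is the
 * 320x200 mode 13h framebuffer. Each source row expands into a red
 * row, a green row and a blue row, which the eye blends together. */
static void draw_interleaved(uint8_t *dst, const uint32_t *src)
{
    int x, y;
    for (y = 0; y < 66; y++) {
        for (x = 0; x < 320; x++) {
            uint32_t p = src[y * 320 + x];
            dst[(y * 3 + 0) * 320 + x] = (uint8_t)((p >> 12) & 63);       /* red   */
            dst[(y * 3 + 1) * 320 + x] = (uint8_t)(64 + ((p >> 6) & 63)); /* green */
            dst[(y * 3 + 2) * 320 + x] = (uint8_t)(128 + (p & 63));       /* blue  */
        }
    }
}
[/code]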