Amiga.org
Amiga computer related discussion => Amiga Hardware Issues and discussion => Topic started by: Jose on August 04, 2006, 01:30:31 PM
-
Why is it that image scaling artifacts are more evident on LCDs than on plasma screens? Shouldn't it be the same, since the image is being upscaled to a different resolution and aspect ratio in both cases? And most plasmas have a higher contrast ratio, which can be more than triple that of LCDs, so artifacts should be even more visible on plasma.
Anyone ?
Done a google search but found mostly garbage, then thought it would be a nice topic here...
:pint:
-
What kind of artifacts, specifically?
-
It can depend on the quality of the scaler hardware built into the set, so some LCDs may be better than some plasmas.
Generally though, because LCDs are so much sharper, you tend to notice the artifacts more, even on a smaller set.
-
Generally though, because LCDs are so much sharper, you tend to notice the artifacts more, even on a smaller set.
That would be my thought. Plasma has more of a glow (which looks softer) than a backlight with individually shaded cells (LCD).
-
If there's one thing that really annoys me about the "new" display tech, it's "Native Resolution"!
What's the point of awesome(tm) display quality when the input image data comes in 20 different formats, which have to be upscaled, downscaled, stretched and squished to match the device's -native- resolution, so that in the process the image quality gets mashed and warped to sh*t?
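For what it's worth, that mashing mostly comes down to interpolation. Here is a minimal one-dimensional sketch in C of what a scaler has to do when the factor isn't a whole number (the names are invented purely for illustration; real sets use fancier multi-tap filters, but the blending idea is the same):

/* Rough sketch of fractional rescaling: when the scale factor isn't a
 * whole number, most destination pixels land between two source pixels
 * and end up as a weighted mix of the two -- which reads as softness. */
#include <stdint.h>

void resample_row(const uint8_t *src, int src_w, uint8_t *dst, int dst_w)
{
    for (int x = 0; x < dst_w; x++) {
        double pos  = (double)x * src_w / dst_w;   /* position in source row   */
        int    i    = (int)pos;                    /* left source pixel        */
        double frac = pos - i;                     /* how far toward the right */
        int    j    = (i + 1 < src_w) ? i + 1 : i; /* right source pixel       */
        dst[x] = (uint8_t)((1.0 - frac) * src[i] + frac * src[j] + 0.5);
    }
}

Every blended pixel is a little bit of blur, so the further the input format is from the panel's native grid, the mushier it gets.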
-
Most LCDs have a 'soften' slider option which can soften up the scaled image a bit - I found this helped to hide the scaling artifacts on LCD.
-
@coldfish
That's exactly what I think.
-
DLP 4 ME
-
@lou_dias
Same problem: a whole bunch of different signals incompatible with the native resolution that have to be scaled (read: screwed up).
-
Jose wrote:
@lou_dias
Same problem: a whole bunch of different signals incompatible with the native resolution that have to be scaled (read: screwed up).
All I know is, on my 19" LCD with a native resolution of 1280x1024, when I go to 1280x768 (only the vertical is affected) I actually lose horizontal clarity on the monitor... then I plug it into the RGB-in on the DLP and it looks perfect. WTF?
DLP is a bit more natural as it's projected light on mirrors. It seems to produce a pretty natural picture, and I watch DTV all day long, plus DVDs and my Gamecube in progressive scan, and ANALOG TV.
-
@Everyone
I'll tell you what I think the problem is... and this is JUST my opinion!
The problem is Microsoft!... huh? you say?
Thanks to good ole Windows and their stupid, non-standard video resolutions, DVDs and movies and anything with normal video resolutions don't look right unless they conform to VGA-type crap!
In other words, 720x480, 1440x960 and so on versus 640x480, 800x600, 1024x768 (Windows VGA resolutions).
HDTV of course is a whole different thing, but I believe all these devices conform to those Windows VGA resolutions, which are wrong for video... Microsoft couldn't go to video... so they made video go to them.
So now, instead of being able to download true video clips (352x240, 720x480), you're forced to use fake (320x240, 640x480) crap and then stretch it for it to look normal, or degrade it.
By the way, I had an INFOCUS XL-1 DLP projector and it looked great with an Amiga via S-Video.
-
Not to mention that on Windows you have dedicated drivers, driver calibration tools and such.
For the last two days I've been on my nephew's 15" TFT (he's on holiday now, so I've stol... erm, tested it on my A4000).
I don't like it.
-
leirbag28 wrote:
the Problem is Microsoft!
Wasn't the VGA standard defined by someone else (VESA springs to mind)? My understanding is that graphics cards support specific resolutions such as 320x240, 640x480 etc., and Windows naturally uses these resolutions. As much as I dislike Microsoft, I don't think it's fair to blame them for this.
-
Just out of interest, is the picture quality any better if the scaling is an exact whole-number factor?
Say, for example, you have a 1600x1200 LCD and you display an 800x600 screen on it. Surely each pixel would just be quadrupled and you should get a pixellated (obviously) but crisp display?
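Just to illustrate what I mean by quadrupling, here's a minimal sketch in C (made-up names, not how any particular panel's scaler actually works):

/* Crude 2x "pixel quadrupling" (nearest-neighbour) upscale: every source
 * pixel is copied into a 2x2 block, so edges stay crisp but blocky --
 * nothing is ever interpolated. Illustrative only. */
#include <stdint.h>

void upscale2x(const uint32_t *src, uint32_t *dst, int w, int h)
{
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            uint32_t p = src[y * w + x];
            int dx = x * 2, dy = y * 2;
            dst[dy * (2 * w) + dx]           = p;   /* top-left     */
            dst[dy * (2 * w) + dx + 1]       = p;   /* top-right    */
            dst[(dy + 1) * (2 * w) + dx]     = p;   /* bottom-left  */
            dst[(dy + 1) * (2 * w) + dx + 1] = p;   /* bottom-right */
        }
    }
}

Since every output pixel is an exact copy of a source pixel, an exact 2x factor could in principle stay crisp, whereas a fractional factor has to blend neighbouring pixels.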
I've never really used an LCD properly myself, but I have seen the blurring that happens when you use non-native resolutions. While they can be very nice at their native resolutions, I'm very happy with my CRT - it deals with everything I throw at it very well.
I am using a 19 inch Sony Multiscan Trinitron G400 with my Amiga 4000 - while it weighs an absolute ton and is huuuuge, it can display the outputs of my Cybervision PPC and scandoubler very well, from those low-res 320x256 games all the way up to 1600x1200. :-D
-
I am using a 19 inch Sony Multiscan Trinitron[...]
You misspelt "Herniatron". :-D I've carted around lots of monitors, and I always cringe when I see a Sony monitor that needs to be moved.
Anyhow, I think the more modern high-resolution LCD displays do a much better job than the older ones. A friend of mine has one of the new Dell 21" flat panel LCDs, and I must say, it's beautiful in just about any resolution. It's the first LCD that I'd consider an improvement over my ViewSonic GS815.
-
You misspelt "Herniatron". :-D I've carted around lots of monitors, and I always cringe when I see a Sony monitor that needs to be moved.
No kidding!! :-o I had to carry the damn thing up two flights of stairs to my attic-bedroom! :cry:
-
maffoo wrote:
leirbag28 wrote:
the Problem is Microsoft!
Wasn't the VGA standard defined by someone else (VESA springs to mind)? My understanding is that graphics cards support specific resolutions such as 320x240, 640x480 etc., and Windows naturally uses these resolutions. As much as I dislike Microsoft, I don't think it's fair to blame them for this.
Correct. These standards predate Windows - well, recognisable versions; let's not forget Windows 1.0 was around in 1985, and VESA wasn't established until 1989, to standardise support for 800x600 displays.
And even if MS had a bearing on it, could you really expect anyone to predict that 15 years later we'd be watching movies on a computer?
-
@alewis
Quote:
And even if MS had a bearing on it, could you really expect anyone to predict that 15 years later we'd be watching movies on a computer?
-----------------------------------------------------------------
Yes! Absolutely! I've got one word: AMIGA... since 1985 it was known immediately!
Problem is... IBM and all the other machines were already conforming to computer screens and saw the Amiga as a toy! Big mistake... look at computers today, doing what the Amiga was doing from 1985 through the 1990s.
It's their own fault that they struggle with video... they lacked vision.
Also, even though Microsoft didn't invent the VGA standard... they most certainly popularized those screen modes.
Even Mac resolutions were different and closer to video.
Computers are becoming what Amigas were... funny!
PCs would have loved to become what Amigas were; even in the 1980s you can see the attempts...
Just watch the COMPUTER CHRONICLES TV series.
-
Problem is... IBM and all the other machines were already conforming to computer screens and saw the Amiga as a toy! Big mistake...
VGA said goodbye to low-resolution displays. Sure, you need additional hardware if you want genlock, composite video, etc. But many Amiga users have additional hardware just for the sake of having a readable Workbench (scandoubler/deinterlacer, graphics card).
Also note that 640x480, 800x600, etc. have square pixels. 720x480 comes from hardware designers being cheap and using one crystal to derive both the pixel clock and the NTSC color carrier; the outer edges of an image that size aren't even visible on most TVs.
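Back-of-the-envelope, assuming the usual 4:3 picture shape (the figures below are just the standard BT.601/DVD numbers, not anything from this thread):

/* Rough arithmetic sketch: why 720x480 pixels aren't square.
 * Assumes a 4:3 display aspect ratio; constants are the usual
 * BT.601/DVD values. */
#include <stdio.h>

int main(void)
{
    double display_aspect = 4.0 / 3.0;        /* intended picture shape          */
    double storage_aspect = 720.0 / 480.0;    /* = 1.5, the raw pixel-grid shape */

    double par_naive  = display_aspect / storage_aspect;   /* 8/9  ~= 0.889 */
    double par_active = display_aspect / (704.0 / 480.0);  /* 10/11 ~= 0.909 */

    printf("PAR if all 720 samples were picture: %.3f\n", par_naive);
    printf("PAR with ~704 active samples:        %.3f\n", par_active);
    return 0;
}

Either way the pixels come out narrower than square (roughly 0.9:1), which is why 720x480 material has to be resampled before it looks right on a square-pixel display like 640x480.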