Author Topic: SCANLINE EFFECTS...

Offline Floid

  • Hero Member
  • Join Date: Feb 2003
  • Posts: 918
Re: SCANLINE EFFECTS...
« on: September 25, 2003, 05:30:43 AM »
The concept you're looking for is termed 'vsync' in the 3D gaming world, I think.

'Hardware' mouse pointers - as found on my venerable Matrox G200, and many accelerated 2D chipsets before and since - should do it, unless the chip itself was designed poorly.  For window dragging or animation, it's harder to say, and/or it's the fault of various graphics APIs.

Violent window shaking won't "tear" the window itself on my XFree86/G200 (and XFree86 is a sort of worst-case for these issues), but of course the contents corrupt temporarily if I move faster than the contents can redraw, and I can paint some nice 'trails' on windows behind that are slow to redraw.

I gather it can be a bit of a pain to write a (non-3D/direct-rendered) X11 application that synchronizes as we'd like (not sure if it's 100% impossible; there's at least the double-buffer extension)...  This should probably be less of a problem on Windows, but I have no idea what's up there, and knowledge of best practices doesn't travel quickly between Windows developers.  (Read: even if it's doable and easy, do you expect the author of eRecipeDatabase 2000 to have known it?)
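
For the curious, here's a minimal sketch of driving that double-buffer extension (DBE) from Xlib, as I understand it - all drawing goes to an off-screen back buffer and XdbeSwapBuffers presents the finished frame in one shot, so you never see a half-drawn window.  Whether the swap actually lands on vertical retrace is still up to the server and driver, so treat it as an illustration of the idea, not a guarantee of tear-free output:

/* Minimal double-buffered drawing via the X11 DBE extension.
 * Build with something like: cc dbe_demo.c -o dbe_demo -lX11 -lXext
 * Error handling is mostly omitted; this is a sketch, not production code. */
#include <X11/Xlib.h>
#include <X11/extensions/Xdbe.h>
#include <unistd.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int major, minor;
    if (!dpy || !XdbeQueryExtension(dpy, &major, &minor))
        return 1;                               /* no display or no DBE */

    int scr = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                     0, 0, 320, 240, 0,
                                     BlackPixel(dpy, scr), BlackPixel(dpy, scr));
    XMapWindow(dpy, win);

    /* All rendering goes to the back buffer; the visible front buffer is
     * only ever touched by the swap, which replaces it in one operation. */
    XdbeBackBuffer back = XdbeAllocateBackBufferName(dpy, win, XdbeBackground);
    GC gc = XCreateGC(dpy, back, 0, NULL);

    for (int frame = 0; frame < 300; frame++) {
        XSetForeground(dpy, gc, WhitePixel(dpy, scr));
        XFillRectangle(dpy, back, gc, frame % 280, 100, 40, 40);

        XdbeSwapInfo swap = { win, XdbeBackground };
        XdbeSwapBuffers(dpy, &swap, 1);         /* present the finished frame */
        XFlush(dpy);
        usleep(16000);                          /* crude ~60fps pacing, not real vsync */
    }

    XCloseDisplay(dpy);
    return 0;
}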

Judging from the "Digital Glitch" complaint, the problem with the public versions of Elate/Intent/DE was that the interface through the graphics layer of the host OS didn't offer any/enough opportunity for synchronization.  Plus, some people may have found other, weirder raster tricks on the Amiga to ensure graphics would flow smoothly, probably akin to the motion-predictive compensation now being studied to deal with the problem of animation on LCDs and other displays with 'full-on' pixel duty cycles.

(Yes, Virginia, interlace and CRT flicker can actually help render motion 'smoothly,' though the "slow phosphor" displays used to compensate for yesterday's scan rates negated most of the benefit.  (Assuming no jitter, wiggle, or whatever you'd call it between scans, interlace at, say, 200Hz should be as good as progressive scan at 100Hz, while actually having an advantage at rendering motion, assuming the phosphor decay times were tweaked appropriately.  Faster, more rapidly decaying scan-flashes of light per-pixel would leave less of a 'trail' on both the display phosphor and your retina.))
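
Quick back-of-the-envelope arithmetic on that interlace claim (the numbers are just the made-up ones above): each individual line gets refreshed at the same rate either way, but the interlaced display flashes *something* new twice as often, which is where the claimed motion advantage would come from.

/* Illustrative numbers only - comparing 200Hz interlaced fields with
 * 100Hz progressive frames, per the hand-waving above. */
#include <stdio.h>

int main(void)
{
    double progressive_hz = 100.0;  /* full frames per second */
    double interlaced_hz  = 200.0;  /* fields per second, half the lines each */

    printf("per-line refresh:     progressive %.0f Hz, interlaced %.0f Hz\n",
           progressive_hz, interlaced_hz / 2.0);
    printf("time between flashes: progressive %.1f ms, interlaced %.1f ms\n",
           1000.0 / progressive_hz, 1000.0 / interlaced_hz);
    return 0;
}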
 

Offline Floid

  • Hero Member
  • Join Date: Feb 2003
  • Posts: 918
Re: SCANLINE EFFECTS...
« Reply #1 on: September 25, 2003, 06:03:00 AM »
The 'scanline,' of course, is only a concept - it's the line traced by the magnetically-swept *electron beam* as it moves across the display, lighting one pixel at a time, once per frame.

What you see on poorly-synchronized videotape is an artifact or interference pattern of sorts: the camera's superhuman 'awareness' of the decay in brightness of the phosphors on the screen, modulated by the differences in both synchronization (when the camera starts sampling its frame, or field if it's interlaced) and rate (the camera at some fraction of 60Hz, say, while the display runs at 75Hz) between the camera and the display.
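
To put rough numbers on that rate mismatch (example figures only, and the real pattern also depends on shutter time and phosphor decay): the dark band you see rolls at roughly the difference - the 'beat' - between the two rates.

/* Rough beat-frequency estimate for a camera filming a CRT. */
#include <stdio.h>

int main(void)
{
    double camera_hz  = 60.0;       /* camera field/frame rate */
    double display_hz = 75.0;       /* display refresh rate */

    double beat_hz = display_hz - camera_hz;    /* rolling-band rate */
    if (beat_hz < 0.0)
        beat_hz = -beat_hz;

    printf("rolling band: ~%.1f Hz, one full roll every ~%.0f ms\n",
           beat_hz, 1000.0 / beat_hz);
    return 0;
}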

If LCDs don't have the problem, it's for a few reasons -

-They're usually lit by fluorescent/cold-cathode sources, and today's units run at high frequencies for energy efficiency; plus, the bulbs have 'long' phosphors to even out any flicker.

-In turn, today's camcorders and the like have 'shutter speeds' or scan rates optimized not to turn the image into a flickery mess even in areas lit with crappy, oldschool fluorescents.  With a film camera, you'd just shoot at 1/30th of a second or slower and hope it worked out.  (I think a few consumer digicams even have modes specifically for taking snapshots of CRTs, and they might be able to do some sampling/synchronization tricks in the fraction of time between jabbing the shutter and capturing the image.)

-Some displays might use 'white' LED backlights run off DC anyway.

-TFT turns out to be pretty much full duty-cycle (I have to admit, I'm hazy on this), with a few caveats:

 -Full duty cycle turns out to suck for motion, see above.

 -Doing some research, apparently a lot of displays 'dither' the signals for broader color resolution, so the current to a particular TFT subpixel (R, G, or B) may actually be running at some ridiculously high frequency, with the duty cycle varied to produce variations in brightness between 'off' and 'on' (there's a tiny sketch of that idea after this list).  Now, LCD material itself is supposed to be slow to respond, so I don't know if this means the pixel is actually 'pulsing on and off' at like 10MHz (to pull a random number out of my ___) to dither your particular shade, or more likely, that, y'know, N% of the crystals are aligned one way or the other, and they might 'wobble' a bit but end up letting through a fairly even intensity for the duration of the scan.

 -What the heck happens when you run an LCD with an analog input?  Okay, I gather most *old* analog LCD monitors were dual-scan or other designs that somehow more closely approximated the CRT way of life that our analog video signals were invented for...  But now that just about every desktop display is a TFT, even those that only ship with analog VGA inputs, what's going on there?  I would've assumed the control logic would be simple, and make each pixel run like a CRT pixel, *not* at a full duty cycle, but that would have some really weird implications for brightness (especially when you're designing the same panel to support digital DVI/LVDS input), so I suspect there's some ridiculously complex digital-sampling mojo going on here.  So the analog VGA signals are probably getting sampled into a digital framebuffer, and then displayed as a 'static' TFT image... though that framebuffer itself is RAM, and has to be 'scanned'... or even if it's SRAM, the pixels will still *update* as the bits get flipped by the incoming analog signal, so there's still a scan rate at play, but without any associated phosphor decays.  (Think of watching one of those electromechanical signboards they use on highways or buses flipping its elements around.)
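
Here's the tiny sketch promised above of how that 'dithering' could work - this is just the generic frame-rate-control trick as I understand it, with made-up values and a made-up 4-frame cycle, not anything pulled from a real panel controller:

/* A panel that can only drive 6-bit levels fakes an 8-bit level by
 * alternating between the two nearest codes over a short frame cycle,
 * so the eye (and the slow liquid crystal) averages them out. */
#include <stdio.h>

/* Which 6-bit code to drive on a given frame for a desired 8-bit level. */
static int frc_code(int target_8bit, int frame)
{
    int low  = target_8bit >> 2;    /* nearest 6-bit code at or below the target */
    int frac = target_8bit & 3;     /* leftover fraction, 0..3 */

    if (low >= 63)                  /* clamp at the top of the 6-bit range */
        return 63;
    /* Drive the higher code on 'frac' out of every 4 frames. */
    return (frame % 4) < frac ? low + 1 : low;
}

int main(void)
{
    int target = 130;               /* some 8-bit gray level */
    int sum = 0;

    for (int frame = 0; frame < 4; frame++) {
        int code = frc_code(target, frame);
        sum += code;
        printf("frame %d: drive 6-bit code %d\n", frame, code);
    }
    /* The average driven code lands on the target expressed in 6-bit units. */
    printf("average %.2f vs. target/4 = %.2f\n", sum / 4.0, target / 4.0);
    return 0;
}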

It makes my head hurt.
 

Offline Floid

  • Hero Member
  • Join Date: Feb 2003
  • Posts: 918
Re: SCANLINE EFFECTS...
« Reply #2 on: September 25, 2003, 06:13:00 AM »
I can't stop snickering about ####uums, now.  (Why is it always *four* characters?)