Amiga.org

Amiga computer related discussion => Amiga Software Issues and Discussion => Topic started by: Ancalimon on September 25, 2003, 03:26:02 AM

Title: SCANLINE EFFECTS...
Post by: Ancalimon on September 25, 2003, 03:26:02 AM
As we all know, there is a scanline that sweeps down the monitor, and each time it passes over the whole screen, the display is updated.

The problem is that this scanline affects the moving objects on screen: the mouse pointer, solid moving windows... everything.

On AGA, the scanline movement is usually timed so well that it seems to stay in one place, not affecting the movement of objects, resulting in very smooth animation.

If the scanline could be kept from moving rapidly around the screen and instead stayed at the bottom of the screen, this would not happen on RTG screens.

Here, if I set my AGA output to a 60 Hz mode, also set my CGX refresh rate to 60 Hz, and do a bit of fine-tuning with CGXMODE, I get that smoothness again.
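
For anyone who wants to see what that kind of sync looks like in code, here is a minimal sketch (mine, not the original poster's method) that waits for the vertical blank with graphics.library's WaitTOF() before each move; slide_box() and its RastPort argument are placeholder names for illustration.

#include <proto/graphics.h>  /* WaitTOF(), SetAPen(), RectFill() */

/* Move a small box one pixel per frame.  WaitTOF() blocks until the
   next vertical blank of the native display, so the erase and redraw
   happen while the beam is off-screen instead of in mid-frame. */
void slide_box(struct RastPort *rp, int frames)
{
    int x;
    for (x = 0; x < frames; x++) {
        WaitTOF();                            /* wait for top-of-frame */
        SetAPen(rp, 0);
        RectFill(rp, x, 50, x + 19, 69);      /* erase old position */
        SetAPen(rp, 1);
        RectFill(rp, x + 1, 50, x + 20, 69);  /* draw one pixel further */
    }
}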
Title: Re: SCANLINE EFFECTS...
Post by: iamaboringperson on September 25, 2003, 03:34:42 AM
Quote
If the scanline could be kept from moving rapidly around the screen and instead stayed at the bottom of the screen, this would not happen on RTG screens.
I can't really understand what you are on about...
Do you know how a raster display works?
Title: Re: SCANLINE EFFECTS...
Post by: Ancalimon on September 25, 2003, 04:02:25 AM
I didn't mean to say that.
If you stop the scanline moving, I know there will be no screen update. What I mean is: move the scanline at a rate where it looks like it is staying in one position. If you point a video camera at your monitor while it's running, you'll see the scanline clearly. After doing the things I mentioned in my earlier post, the camera no longer detects the scanline. The movement of the mouse pointer, windows and everything else has also become smooth; they are not interrupted by the scanline. The scanline still exists, I know, but it has already reached the bottom of the screen by the time a new frame is sent to the monitor, resulting in smoother animation.
Title: Re: SCANLINE EFFECTS...
Post by: Floid on September 25, 2003, 05:30:43 AM
The concept you're looking for is termed 'vsync' in the 3D gaming world, I think.

'Hardware' mouse pointers - as found on my venerable Matrox G200, and many accelerated 2D chipsets before and since - should do it, unless the chip itself was designed poorly.  For window dragging or animation, it's harder to say, and/or the fault of various graphics APIs.

Violent window shaking won't "tear" the window itself on my XFree86/G200 (and XFree86 is a sort of worst-case for these issues), but of course the contents corrupt temporarily if I move faster than the contents can redraw, and I can paint some nice 'trails' on windows behind that are slow to redraw.

I gather it can be a bit of a pain to write a (non 3D/direct-rendered) X11 application that can synchronize as we'd like (not sure if it's 100% impossible, there's at least the double-buffer extension)...  This probably should be less of a problem on Windows, but I have no idea what's up there, and knowledge of best-practices doesn't travel quickly between Windows developers.  (Read: Even if it's doable and easy, do you expect the author of eRecipeDatabase 2000 to have known it?)
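
For reference, here's a minimal sketch of that double-buffer extension (DBE) in plain Xlib C - note it only gives an atomic swap, it does not by itself lock drawing to the retrace; dpy, win and gc are assumed to be an already-opened Display, Window and GC, and draw_frame() is just an illustrative name.

#include <X11/Xlib.h>
#include <X11/extensions/Xdbe.h>

/* Render into an off-screen back buffer and swap it in one request,
   so the window never shows a half-drawn frame.  DBE does not itself
   synchronise the swap to the vertical retrace, though. */
void draw_frame(Display *dpy, Window win, GC gc)
{
    int major, minor;
    if (!XdbeQueryExtension(dpy, &major, &minor))
        return;                                       /* no DBE on this server */

    XdbeBackBuffer back = XdbeAllocateBackBufferName(dpy, win, XdbeBackground);

    XFillRectangle(dpy, back, gc, 10, 10, 100, 100);  /* draw off-screen */

    XdbeSwapInfo info = { win, XdbeBackground };
    XdbeSwapBuffers(dpy, &info, 1);                   /* atomic swap */

    XdbeDeallocateBackBufferName(dpy, back);
    XFlush(dpy);
}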

Judging from the "Digital Glitch" complaint, the problem with the public versions of Elate/Intent/DE was that the interface through the graphics layer of the host OS didn't offer any/enough opportunity for synchronization.  Plus, some people may have found other, weirder raster tricks on the Amiga to ensure graphics would flow smoothly, probably akin to the motion-predictive compensation now being studied to deal with the problem of animation on LCDs and other displays with 'full-on' pixel duty cycles.

(Yes, Virginia, interlace and CRT flicker can actually help render motion 'smoothly,' though the "slow phosphor" displays used to compensate for yesterday's scan rates negated most of the benefit.  (Assuming no jitter, wiggle, or whatever you'd call it between scans, interlace at, say, 200Hz should be as good as progressive scan at 100Hz, while actually having an advantage at rendering motion, assuming the phosphor decay times were tweaked appropriately.  Faster, more rapidly decaying scan-flashes of light per-pixel would leave less of a 'trail' on both the display phosphor and your retina.))
Title: Re: SCANLINE EFFECTS...
Post by: iamaboringperson on September 25, 2003, 05:37:41 AM
@FaLLeNOnE

I know what you are trying to say by 'scanline' now...

You seem to misunderstand the term.
A television usually has 625 (PAL) or 525 (NTSC) scanlines, which are interlaced.
Let's use PAL as an example of what happens:
PAL uses 25 frames per second, each made up of 2 'fields' per frame, so there are 50 of these fields per second.
Each field is drawn as 312.5 lines down the screen (some are hidden - sort of).
Each one of these lines is a 'scanline' or 'raster (line)'.
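
Just to put numbers on that, a quick sanity check in plain C (nothing here is Amiga-specific, it only re-derives the figures above):

#include <stdio.h>

/* Sanity-check the PAL figures quoted above. */
int main(void)
{
    double frames_per_sec   = 25.0;
    double fields_per_frame = 2.0;
    double lines_per_frame  = 625.0;

    double fields_per_sec  = frames_per_sec * fields_per_frame;    /* 50    */
    double lines_per_field = lines_per_frame / fields_per_frame;   /* 312.5 */
    double lines_per_sec   = lines_per_frame * frames_per_sec;     /* 15625 */

    printf("%g fields/s, %g lines/field, %.3f kHz line rate\n",
           fields_per_sec, lines_per_field, lines_per_sec / 1000.0);
    return 0;
}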

What you seem to be talking about with the camera is where the camera and TV are slightly out of sync.
The part that looks a bit wobbly when you shake the camera from side to side is not called a scanline; it moves up or down because both devices do not start drawing their display at the exact same time. One starts to draw its display while the other is in the middle of its own.
If the gap is moving up or down, then they are not at exactly the same frequency; they might both be about 50 Hz (interlaced) (or 60 Hz for NTSC), but they are slightly different.

I hope this helps clarify things... :-)
Title: Re: SCANLINE EFFECTS...
Post by: Lando on September 25, 2003, 05:43:09 AM
How do these things work with LCD screens, where there is no beam drawing the display?

I mean, I wrote copper effects on my A4k which changed colour registers horizontally and vertically using waits and moves - the copper waits for the beam position to be equal to or greater than the value you write in the copperlist, then moves the data to the register - but on an LCD screen there is no beam, so how does it work out the "beam position"?
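
For anyone who hasn't seen one, here's roughly what such a wait/move user copper list looks like with the graphics.library macros - a sketch only, with add_copper_bars(), the line spacing and the colour values made up for illustration:

#include <exec/memory.h>
#include <hardware/custom.h>
#include <graphics/copper.h>
#include <graphics/gfxmacros.h>
#include <proto/exec.h>
#include <proto/graphics.h>

extern struct Custom custom;

/* Attach a user copper list to a screen's viewport: on each listed
   raster line the copper WAITs for the beam, then MOVEs a new value
   into colour register 0, giving horizontal colour bars. */
void add_copper_bars(struct ViewPort *vp)
{
    struct UCopList *ucl = AllocMem(sizeof(struct UCopList),
                                    MEMF_PUBLIC | MEMF_CLEAR);
    UWORD line;

    if (!ucl) return;

    CINIT(ucl, 2 * 16 + 2);                       /* rough instruction count  */
    for (line = 0; line < 16; line++) {
        CWAIT(ucl, line * 10, 0);                 /* wait for raster line     */
        CMOVE(ucl, custom.color[0], line << 8);   /* brighter shade of red    */
    }
    CEND(ucl);

    Forbid();
    vp->UCopIns = ucl;                            /* install on the viewport  */
    Permit();
    RethinkDisplay();                             /* rebuild the copper lists */
}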
Title: Re: SCANLINE EFFECTS...
Post by: iamaboringperson on September 25, 2003, 05:51:30 AM
@Lando

ahhh! ;-)
Good question! :-)

The Amiga was certainly designed for CRTs!
The beam position is worked out using timers in the Amiga custom chips.
A sync signal is then sent to the monitor, and the monitor syncs to this signal (which basically says that when the signal says 'go', the monitor starts at the very top).
An LCD screen is usually made to work with this same signal. There is no real scanline, but the monitor still knows what to do with the sync signal, just like it does with the other parts of the signal (or other signals).
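
As a small illustration, graphics.library exposes that counter through VBeamPos(), so a program can ask where the (real or notional) beam is right now - this little sketch is mine, not something from the thread:

#include <stdio.h>
#include <proto/graphics.h>

/* Print the vertical beam position a few frames in a row.  On the
   native chipset the value comes from the VPOSR/VHPOSR counters in
   the custom chips; they keep counting regardless of whether a CRT
   or an LCD is on the other end of the sync signal. */
int main(void)
{
    int i;
    for (i = 0; i < 5; i++) {
        WaitTOF();                                /* top of frame */
        printf("beam is at line %ld\n", VBeamPos());
    }
    return 0;
}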

Title: Re: SCANLINE EFFECTS...
Post by: Floid on September 25, 2003, 06:03:00 AM
The 'scanline', of course, is only a concept - it's the line drawn by the magnetically swept *electron beam* that moves across the display, lighting one pixel at a time per frame.

What you see on poorly-synchronized videotape is an artifact or interference pattern of sorts, created by the camera's superhuman 'awareness' of the decay in brightness of the phosphors on the screen, modulated by the differences in both synchronization (when the camera starts sampling its frame, or field if it's interlaced) and rate (camera's at fractions of 60Hz, say, while display is at 75Hz) between the camera and the display.
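
To put a number on that rate difference: the rolling band repeats at roughly the difference of the two frequencies. A tiny worked example using the figures above (60 Hz camera, 75 Hz display):

#include <stdio.h>

/* The rolling band on tape is, roughly, a beat between the two rates:
   it repeats at the difference of the frequencies.  The figures are
   just the example numbers from the post above. */
int main(void)
{
    double camera_hz  = 60.0;
    double display_hz = 75.0;
    double beat_hz    = display_hz - camera_hz;   /* 15 cycles per second */

    printf("band rolls about %.0f times per second (every %.0f ms)\n",
           beat_hz, 1000.0 / beat_hz);
    return 0;
}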

If LCDs don't have the problem, it's for a few reasons -

-They're usually lit by fluorescent/cold-cathode sources, and today's run at high frequencies for energy efficiency, plus the bulbs have 'long' phosphors to even out any flicker.

-In turn, today's camcorders and the like have 'shutter speeds' or scanrates optimized not to turn the image to a flickery mess even in areas lit with crappy, oldschool fluorescents.  With a film camera, you'd just shoot at 1/30th of a second or slower and hope it worked out.  (I think a few consumer digicams even have modes specifically for taking snapshots of CRTs, and they might be able to do some sampling/synchronization tricks in the fraction of time between jabbing the shutter and capturing the image.)

-Some displays might use 'white' LED backlights run off DC anyway.

-TFT turns out to be pretty much full duty-cycle (I have to admit, I'm hazy on this), with a few caveats:

 -Full duty cycle turns out to suck for motion, see above.

 -Doing some research, apparently a lot of displays? 'dither' the signals for broader color resolution, so the current to a particular TFT subpixel (R,G, or B) may actually be running at some ridiculously high frequency, with duty-cycle varied to produce variations in brightness between 'off' and 'on.'  Now, LCD material itself is supposed to be slow to respond, so I don't know if this means the pixel is actually 'pulsing on and off' at like 10MHz (to pull a random number out of my ___) to dither your particular shade, or more likely, that, y'know, N% of the crystals are aligned one way or the other, and they might 'wobble' a bit but end up letting through a fairly even intensity for the duration of the scan.

 -What the heck happens when you run an LCD with an analog input?  Okay, I gather most *old* analog LCD monitors were dual-scan or other designs that somehow more closely approximated the CRT way of life that our analog video signals were invented for...  But now that just about every desktop display is a TFT, even those that only ship with analog VGA inputs, what's going on there?  I would've assumed the control logic would be simple, and make each pixel run like a CRT pixel, *not* at a full duty cycle, but that would have some really weird implications for brightness (especially when you're designing the same panel to support digital DVI/LVDS input), so I suspect there's some ridiculously complex digital-sampling mojo going on here.  So the analog VGA signals are probably getting sampled into a digital framebuffer, and then displayed as a 'static' TFT image... though that framebuffer itself is RAM, and has to be 'scanned'... or even if it's SRAM, then the pixels will still *update* as the bits get flipped by the incoming analog signal, so there's still a scan rate at play, but without any associated phosphor decays.  (Think of watching one of those electromechanical signboards they use on highways or buses flipping its elements around.)

It makes my head hurt.
Title: Re: SCANLINE EFFECTS...
Post by: Karlos on September 25, 2003, 12:00:54 PM
I think what the guy is trying to say is that he gets smooth window movements and stuff when both his AGA and graphics card vertical refresh rates are set the same.

I know from my own experiments that some of the OS functions, such as WaitTOF(), don't seem to work as expected with my BVision card. WaitBOVP() seems to work OK, however.

No doubt some of these functions like WaitTOF() are tied to the original hardware, and perhaps there's a sync bug in dragging windows etc. around because of this.
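
For the record, a minimal sketch of the difference being discussed - WaitTOF() always tracks the native display's vertical blank, while WaitBOVP() is told which ViewPort to watch, e.g. the one belonging to an RTG screen (rtg_screen and wait_frame() are just placeholder names):

#include <intuition/intuition.h>
#include <proto/graphics.h>

/* WaitTOF() always tracks the native Amiga display; WaitBOVP() is
   told which viewport to watch, so here we wait for the beam to pass
   the bottom of an RTG screen's viewport instead. */
void wait_frame(struct Screen *rtg_screen)
{
    WaitBOVP(&rtg_screen->ViewPort);   /* bottom of *this* screen's viewport */
    /* WaitTOF();                         would wait on the native display   */
}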
Title: Re: SCANLINE EFFECTS...
Post by: lempkee on September 25, 2003, 12:10:58 PM
Karlos: broken lib? It should work on BVision! But on Elbox P96 (Mediator), WaitTOF() doesn't work.

Anyway, there seem to be quite a few functions that are not compatible atm ;( - let's hope it changes.


Anyway, to those who talked about LCDs: can I use an LCD (TV) with my Amiga 500/1200 or not?
Does it handle the 15 kHz rates or not? For games and demos... classic stuff... no scandoubler/flickerfixer etc., just plug and play?

Title: Re: SCANLINE EFFECTS...
Post by: Karlos on September 25, 2003, 01:13:13 PM
Quote

lempkee wrote:
Karlos: broken lib? It should work on BVision! But on Elbox P96 (Mediator), WaitTOF() doesn't work.


Search me, guv'ner... I am using CGX4.2 on OS3.5 BB2. Perhaps it is fixed; I did those tests ages ago with CGX4.1 and OS3.5 BB1...
Title: Re: SCANLINE EFFECTS...
Post by: iamaboringperson on September 26, 2003, 07:57:01 AM
Quote
I know from my own experiments that some of the OS functions, such as WaitTOF(), don't seem to work as expected with my BVision card. WaitBOVP() seems to work OK, however.

I was thinking about WaitTOF(), but what I'm wondering is: which monitor does it refer to?
If a person is using a two (or more) monitor setup, or otherwise has multiple screens of different resolutions, which one does it check?
I can't see how it knows which monitor the program is currently using.
This is one function that needs some updating - I think.
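
One way a program could pin the wait to its own monitor, as a hedged sketch (sync_to_own_monitor() is just an illustrative name): ask Intuition for the ViewPort of the screen its window is on, and wait on that instead of calling the global WaitTOF().

#include <intuition/intuition.h>
#include <proto/intuition.h>
#include <proto/graphics.h>

/* Instead of the display-global WaitTOF(), look up the viewport of
   the screen this window is currently shown on and wait for the beam
   to pass its bottom edge, so the wait follows whichever monitor that
   screen is opened on. */
void sync_to_own_monitor(struct Window *win)
{
    struct ViewPort *vp = ViewPortAddress(win);   /* intuition.library */

    if (vp)
        WaitBOVP(vp);
}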