The 'scanline', of course, is only a concept - it's the path traced by the magnetically-swept *electron beam* as it moves across the display, lighting one phosphor dot at a time, row after row, every frame.
What you see on poorly-synchronized videotape is an artifact or interference pattern of sorts: the camera is 'aware' of the decay in brightness of the screen's phosphors in a way your eye isn't, and that awareness is modulated by the differences in both synchronization (when the camera starts sampling its frame, or field if it's interlaced) and rate (the camera at some fraction of 60Hz, say, while the display refreshes at 75Hz) between the camera and the display.
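If you want to see that beat effect in numbers, here's a back-of-the-envelope Python sketch - the 75Hz display, 59.94Hz camera, and ~2ms phosphor decay are all made-up illustration values, not a model of any real camera or tube. It just integrates each row's decaying brightness over one camera exposure and reports where the dimmest band of rows lands; that band ends up somewhere different every camera frame, which is the crawling bar you see on tape.

```python
# Back-of-the-envelope sketch, not a real video pipeline: the display refreshes
# its rows top-to-bottom at one rate, the camera integrates light at another,
# and the dark band is just the rows whose phosphors have decayed the most over
# the camera's exposure. All numbers (75 Hz display, 59.94 Hz camera, ~2 ms
# phosphor decay) are made up for illustration.
import math

DISPLAY_HZ = 75.0        # CRT refresh rate
CAMERA_HZ = 59.94        # camera frame rate (NTSC-ish)
ROWS = 480               # scanlines on the display
DECAY_S = 0.002          # crude exponential phosphor decay constant

def row_brightness(row, t):
    """Brightness of one row at absolute time t: 1.0 the instant the beam hits
    it, decaying exponentially until the next refresh sweeps past."""
    frame_period = 1.0 / DISPLAY_HZ
    row_time = (row / ROWS) * frame_period        # when this row gets drawn each refresh
    since_drawn = (t % frame_period) - row_time
    if since_drawn < 0:                           # beam hasn't reached it yet this refresh
        since_drawn += frame_period
    return math.exp(-since_drawn / DECAY_S)

def darkest_row_in_camera_frame(frame_index):
    """Integrate each (coarsely sampled) row over one camera exposure and
    return the dimmest one."""
    t0 = frame_index / CAMERA_HZ
    samples = 200
    totals = []
    for row in range(0, ROWS, 8):
        total = sum(row_brightness(row, t0 + k / (CAMERA_HZ * samples))
                    for k in range(samples))
        totals.append((total, row))
    return min(totals)[1]

# The dim band lands on a different row each camera frame, tracking the beat
# between 75 Hz and 59.94 Hz.
for frame in range(10):
    print(f"camera frame {frame:2d}: dimmest band near row {darkest_row_in_camera_frame(frame):3d}")
```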
If LCDs don't have the problem, it's for a few reasons -
-They're usually lit by fluorescent/cold-cathode sources, and today's inverters drive those at high frequencies for energy efficiency; plus the tubes have 'long' phosphors to even out any flicker.
-In turn, today's camcorders and the like have 'shutter speeds' or scan rates tuned so the image doesn't turn into a flickery mess even in areas lit by crappy, old-school fluorescents. With a film camera, you'd just shoot at 1/30th of a second or slower and hope it worked out. (I think a few consumer digicams even have modes specifically for taking snapshots of CRTs, and they might be able to do some sampling/synchronization tricks in the fraction of a second between jabbing the shutter and capturing the image.)
-Some displays might use 'white' LED backlights run off DC anyway.
-TFT turns out to be pretty much full duty-cycle - each pixel holds its state for the whole frame instead of flashing and decaying (I have to admit, I'm hazy on this) - with a few caveats:
-Full duty cycle turns out to suck for motion; see above.
-Doing some research, apparently a lot of displays(?) 'dither' the signals for broader color resolution, so the drive to a particular TFT subpixel (R, G, or B) may actually be toggling at some ridiculously high frequency, with the duty cycle varied to produce brightness levels between 'off' and 'on.' Now, the LCD material itself is supposed to be slow to respond, so I don't know whether this means the pixel is literally 'pulsing on and off' at like 10MHz (to pull a random number out of my ___) to dither your particular shade, or - more likely - that, y'know, N% of the crystals end up aligned one way or the other, and they might 'wobble' a bit but let through a fairly even intensity for the duration of the scan. (There's a rough sketch of that duty-cycle-averaging idea after this list.)
-What the heck happens when you run an LCD with an analog input? Okay, I gather most *old* analog LCD monitors were dual-scan or other designs that more closely approximated the CRT way of life our analog video signals were invented for... But now that just about every desktop display is a TFT, even the ones that only ship with analog VGA inputs, what's going on there? I would've assumed the control logic would be simple and make each pixel behave like a CRT pixel, *not* at a full duty cycle, but that would have some really weird implications for brightness (especially when you're designing the same panel to support digital DVI/LVDS input), so I suspect there's some ridiculously complex digital-sampling mojo going on. The analog VGA signals are probably getting sampled into a digital framebuffer and then displayed as a 'static' TFT image... though that framebuffer itself is RAM and has to be 'scanned'... or even if it's SRAM, the pixels will still *update* as the bits get flipped by the incoming analog signal, so there's still a scan rate at play, just without any associated phosphor decay. (Think of watching one of those electromechanical signboards they use on highways or buses flipping its elements around.) There's a second sketch of that sample-into-a-framebuffer guess after the list, too.
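Here's the duty-cycle-averaging idea from the dithering caveat above as a toy Python model - the 10kHz drive and ~8ms crystal response are invented numbers, not specs for any real panel. Treat the slow crystal as a first-order low-pass filter on a fast on/off drive, and the transmitted intensity settles near the duty cycle instead of visibly flickering, which is the 'fairly even intensity' hand-wave from that bullet.

```python
# Toy model: a subpixel driven by a fast on/off waveform, with the slow liquid
# crystal modelled as a first-order low-pass filter. All numbers are invented
# for illustration, not specs for any real panel.
DRIVE_HZ = 10_000             # made-up modulation frequency
LC_RESPONSE_S = 0.008         # made-up crystal response time constant (~8 ms)
DT = 1.0 / (DRIVE_HZ * 100)   # simulation step: 100 steps per drive cycle

def settled_transmission(duty_cycle, seconds=0.1):
    """Fraction of light getting through once the crystal has settled,
    given a square-wave drive that's 'on' for duty_cycle of each period."""
    level = 0.0
    for i in range(int(seconds / DT)):
        phase = (i * DT * DRIVE_HZ) % 1.0
        target = 1.0 if phase < duty_cycle else 0.0
        level += (target - level) * (DT / LC_RESPONSE_S)   # first-order lag
    return level

# The crystal can't follow the 10 kHz switching, so the transmitted intensity
# ends up near the duty cycle itself: a steady grey rather than a flicker.
for duty in (0.25, 0.5, 0.75):
    print(f"duty cycle {duty:.2f} -> transmission ~{settled_transmission(duty):.3f}")
```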
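And here's the sample-into-a-framebuffer guess from the last bullet, sketched the same way - the analog_rgb_at() stand-in, the timings, and the decision to ignore blanking are all hypothetical, just the shape of what I imagine the scaler doing, not real scaler logic: sample the analog voltages once per pixel position, quantize them, and park the result in a framebuffer that the panel then holds at full duty cycle until the next pass overwrites it.

```python
# Hypothetical sketch of an analog VGA input being digitized into a framebuffer.
# analog_rgb_at() is a stand-in for the voltages on the cable, and the timing
# math ignores blanking intervals entirely; a real scaler would derive its
# sampling instants from hsync and a recovered pixel clock.
H_PIXELS, V_LINES = 640, 480
REFRESH_HZ = 60.0

def analog_rgb_at(x_time):
    """Fake analog R/G/B voltages (0.0-0.7 V): just a gradient across the line
    so the sketch has something to sample."""
    v = 0.7 * (x_time * REFRESH_HZ * V_LINES)   # ramps 0 -> 0.7 V over one line time
    return (v, v * 0.5, 0.7 - v)

def quantize(volts):
    """0.0-0.7 V -> an 8-bit code, as the scaler's ADC would produce."""
    return max(0, min(255, round(volts / 0.7 * 255)))

# One refresh worth of sampling. The framebuffer is the 'static' image the
# TFT cells then hold at full duty cycle until the next pass rewrites it:
# there's still a scan, but nothing decays in between.
framebuffer = [[(0, 0, 0)] * H_PIXELS for _ in range(V_LINES)]
line_time = 1.0 / (REFRESH_HZ * V_LINES)        # seconds per scanline (no blanking)
for line in range(V_LINES):
    for x in range(H_PIXELS):
        t = (x / H_PIXELS) * line_time          # when this pixel gets sampled
        r, g, b = analog_rgb_at(t)
        framebuffer[line][x] = (quantize(r), quantize(g), quantize(b))

print("top-left pixel:", framebuffer[0][0], " mid-line pixel:", framebuffer[0][H_PIXELS // 2])
```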
It makes my head hurt.