Sounds like you're using a digital monitor - since it's digital, it samples the analogue signal and displays what's been sampled. If the sampling rate is too low (say, only half the pixel frequency), every other pixel gets lost. It seems your monitor has no idea that the Amiga is outputting such a high pixel rate.
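To make that concrete, here's a tiny C sketch - purely illustrative, with made-up pixel values - of what sampling at half the pixel rate does to fine detail:

/* Illustrative only: a digital monitor sampling a pixel stream
 * at the full pixel rate vs. at half the pixel rate. */
#include <stdio.h>

int main(void)
{
    /* Alternating pattern, e.g. 1-pixel-wide vertical lines. */
    const int pixels[] = {1, 0, 1, 0, 1, 0, 1, 0};
    const int n = sizeof(pixels) / sizeof(pixels[0]);

    /* Sampling at the full pixel rate captures every pixel. */
    printf("full rate: ");
    for (int i = 0; i < n; i++)
        printf("%d", pixels[i]);

    /* Sampling at half the pixel rate grabs only every second
     * pixel - the alternating detail collapses into a flat area. */
    printf("\nhalf rate: ");
    for (int i = 0; i < n; i += 2)
        printf("%d", pixels[i]);
    printf("\n");
    return 0;
}

The half-rate run turns the alternating pattern (10101010) into a solid block (1111) - exactly the "every other pixel lost" effect.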
Look at the way a video output and an (analogue) monitor work: the video hardware just outputs a stream of pixels; in Lores they're 140 ns apart (the longest), in Hires 70 ns, and in SuperHires just 35 ns. In the monitor these pixels modulate an electron beam that scans the screen, moving from left to right. After some time - when the desired line length is reached - a horizontal sync pulse tells the monitor to stop scanning, jump to the next line directly underneath, and rewind to the left edge. When the desired frame height is reached, a vertical sync pulse tells the monitor to rewind the scanning beam to the top left corner and start the next frame.
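If you want numbers: the pixel clock is just the reciprocal of the pixel period, f = 1/T. A quick C sketch (note the 140/70/35 ns figures are rounded values; as far as I know the exact PAL clocks are derived from the 28.37516 MHz master clock, giving roughly 7.09/14.19/28.37 MHz):

/* Converting the pixel periods above into pixel clock frequencies.
 * 1/(T in ns) * 1e9 Hz == (1000/T) MHz. */
#include <stdio.h>

int main(void)
{
    const char   *modes[]     = { "Lores", "Hires", "SuperHires" };
    const double  period_ns[] = { 140.0, 70.0, 35.0 };

    for (int i = 0; i < 3; i++) {
        double f_mhz = 1000.0 / period_ns[i];
        printf("%-10s: %6.1f ns  ->  ~%5.2f MHz pixel clock\n",
               modes[i], period_ns[i], f_mhz);
    }
    return 0;
}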
A fixed-frequency monitor uses constant sweep rates, chosen so that the beam covers the whole screen during each horizontal and each full vertical sweep. A multisync monitor has to track the horizontal and vertical sync rates and adjust its sweep rates accordingly.
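The relation a multisync has to track is simple: horizontal sync rate = lines per frame x vertical refresh rate. For standard PAL (which is what a plain Amiga outputs) that works out like this - a back-of-the-envelope C sketch:

/* Horizontal rate = lines per frame x vertical refresh rate.
 * Standard PAL: 312.5 lines per field at 50 Hz. */
#include <stdio.h>

int main(void)
{
    double lines = 312.5, vsync_hz = 50.0;
    double hsync_khz = lines * vsync_hz / 1000.0;

    printf("PAL: %.1f lines x %.0f Hz = %.3f kHz horizontal rate\n",
           lines, vsync_hz, hsync_khz);
    /* A fixed-frequency PAL monitor only locks near these values;
     * a multisync monitor re-measures them and retunes its sweeps. */
    return 0;
}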
A digital (flat) monitor is more complex. It samples (A/D converts) the analogue signal into a buffer. To make sure none of the pixels gets lost, the sampling rate has to match the pixel rate pretty exactly, which is typically done with an edge detection algorithm. Note that this is exactly the way a flickerfixer works - in theory every flat screen integrates one, but in real life they're usually limited in how far they can adapt to the signal.
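Here's a minimal sketch of the edge-detection idea, assuming an already-digitized, heavily oversampled line and a two-level signal - a simplification for illustration, not how any particular scaler chip actually does it:

/* Sketch: estimate the pixel period as the shortest interval
 * between level transitions (edges) in an oversampled scanline. */
#include <stdio.h>

int main(void)
{
    /* Pretend ADC samples of one scanline; each source pixel
     * spans 4 samples here (made-up data). */
    const int line[] = {0,0,0,0, 1,1,1,1, 1,1,1,1, 0,0,0,0, 1,1,1,1};
    const int n = sizeof(line) / sizeof(line[0]);

    int min_gap = n, last_edge = -1;
    for (int i = 1; i < n; i++) {
        if (line[i] != line[i - 1]) {        /* a transition (edge) */
            if (last_edge >= 0 && i - last_edge < min_gap)
                min_gap = i - last_edge;     /* shortest edge spacing */
            last_edge = i;
        }
    }
    /* The shortest edge spacing is (at best) one pixel period, so
     * the sampler can lock its clock to it. Prints 4 here. */
    printf("estimated pixel period: %d samples\n", min_gap);
    return 0;
}

If the picture never contains single-pixel detail, the shortest edge spacing overestimates the period - one reason real scalers can fail to lock onto unusual modes.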
The buffered image (hopefully resembling the original signal) is then output to the display matrix. On LCD screens there's no sweeping beam but rather a continuous memory matrix that you look at - hence the susceptibility to jitter and jerking when the sampling doesn't match.