First, regarding your article about frame drops when running at 75Hz.
The article isn't really scientific in its testing methods, and it doesn't even say which monitors were tested at 60/75Hz.
I would gladly test my monitors the same way they did, if the method is scientific enough. I have a fast system camera; just give me something to test with.
I've read the part about triple buffering several times now, and nothing in it says the framerate would drop to half (or a third, a quarter, etc.) of the given framerate. The only thing you can read out of it is that triple buffering solves the problem. Let me quote it for the sake of it:
There is a technique called triple-buffering that solves this VSync problem. Lets go back to our 50FPS, 75Hz example. Frame 1 is in the frame buffer, and 2/3 of frame 2 are drawn in the back buffer. The refresh happens and frame 1 is grabbed for the first time. The last third of frame 2 are drawn in the back buffer, and the first third of frame 3 is drawn in the second back buffer (hence the term triple-buffering). The refresh happens, frame 1 is grabbed for the second time, and frame 2 is copied into the frame buffer and the first part of frame 3 into the back buffer. The last 2/3 of frame 3 are drawn in the back buffer, the refresh happens, frame 2 is grabbed for the first time, and frame 3 is copied to the frame buffer. The process starts over. This time we still got 2 frames, but in only 3 refresh cycles. That's 2/3 of the refresh rate, which is 50FPS, exactly what we would have gotten without it. Triple-buffering essentially gives the video card someplace to keep doing work while it waits to transfer the back buffer to the frame buffer, so it doesn't have to waste time. Unfortunately, triple-buffering isn't available in every game, and in fact it isn't too common. It also can cost a little performance to utilize, as it requires extra VRAM for the buffers, and time spent copying all of them around. However, triple-buffered VSync really is the key to the best experience as you eliminate tearing without the downsides of normal VSync
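For the sake of argument, that walkthrough can be sketched numerically. Here is a minimal simulation under idealized assumptions (a constant 20ms render time, buffer swaps only on refresh boundaries); the function and variable names are mine, not from the article:

```python
import math

REFRESH_HZ = 75.0
RENDER_FPS = 50.0
refresh = 1.0 / REFRESH_HZ   # ~13.33 ms per scanout
render = 1.0 / RENDER_FPS    # 20 ms to draw one frame

def simulate(buffers, frames=100):
    """Effective FPS with VSync on, for double (2) or triple (3) buffering."""
    t = 0.0
    for _ in range(frames):
        t += render  # draw the frame
        if buffers == 2:
            # Double buffering: the single back buffer stays occupied until
            # the next refresh grabs it, so rendering stalls until then.
            t = math.ceil(t / refresh) * refresh
        # Triple buffering: a second back buffer is free, rendering continues.
    return frames / t

print(f"double-buffered VSync: {simulate(2):.1f} FPS")  # 37.5
print(f"triple-buffered VSync: {simulate(3):.1f} FPS")  # 50.0
```

Which matches the quote: with triple buffering you keep the full 50FPS at 75Hz, while plain double-buffered VSync stalls down to 37.5FPS in this scenario.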
If you read the text about the monitor you showed, you can clearly see that this has nothing to do with the input refresh rate or the normal case; it's just something this monitor does internally, much like a CRT that takes a 50Hz signal and refreshes at 100Hz.
Let's quote from http://www.behardware.com/articles/646-2/benq-fp241wz-1rst-lcd-with-screening.html
The images received at 60 Hz are duplicated twice for a refreshing frequency of 120 Hz instead of 60 Hz.
Now, this is a fully normal TFT monitor in the sense that it takes a 60Hz input, and it isn't showing any more information on screen than any other. That feature has nothing to do with what we're discussing here; it's just a visual feature.
At 0Hz, the GFX card wouldn't transmit a single picture, so you'd see nothing. At 60Hz with a non-moving picture, you would not notice any difference from 200Hz (provided your screen accepted that refresh rate and didn't display out of sync). This is because an LCD doesn't need to constantly redraw the picture (CRTs redraw the picture at the given refresh rate every second). Even though an LCD scans progressively, it doesn't change a pixel's state as long as there is no change in shade.
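To illustrate that point with a toy model (the 4-pixel "screen" and the function name here are made up, purely for illustration), count how many pixel-state transitions a panel actually performs over a run of refreshes:

```python
def transitions(frames):
    """frames: list of per-refresh pixel tuples; count per-pixel state changes."""
    changes = 0
    prev = frames[0]
    for cur in frames[1:]:
        changes += sum(a != b for a, b in zip(prev, cur))
        prev = cur
    return changes

still = [(10, 20, 30, 40)] * 200                 # same picture for 200 refreshes
moving = [(i % 2, 0, 0, 0) for i in range(60)]   # one blinking pixel, 60 refreshes

print(transitions(still))   # 0 -> no pixel changes state, refresh rate irrelevant
print(transitions(moving))  # 59 -> only actual shade changes cost anything
```

A still picture causes zero transitions no matter how many refreshes go by, which is exactly why the refresh rate is invisible on a static LCD image.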
The 0Hz was supposed to be a joke: with any still picture and the backlight on, you can't see any difference from an xHz refresh rate.
Regarding redraw and flicker on the CRT, I've already stated that this isn't what I'm trying to discuss here; the question is just what information is shown on your screen at what time.
Why did you mention that LCDs scan progressively? CRTs do too (well, not TVs and really old computer monitors, which are interlaced).