True. It can work out easier because the fps only has to be higher than 60, but it still looks just as ugly.
I'd much rather have a small delay than a distorted image. It's not like the delay is that big anyway. Maybe if you play CS or Quake for money it makes a difference.
It's also that the image is on screen for longer; the pixels are on for a whole 1/60 of a second instead of a quick flash of phosphor lighting up.
Also, I really don't mind that much.
Add the frame delay caused by the crappy LCDs themselves and the 3-frame prerendering that at least Nvidia cards do by default, and you're rapidly approaching 100 ms+, which, if you're as thick as me when it comes to noticing oversteer, is an issue.
Are you sure you've got that right? The prerender setting in Nvidia drivers doesn't mean that the card renders frames with a 3-frame delay. What googling revealed is that it does some kind of framerate smoothing with extra buffers or something like that. At least it's not anything silly like just adding a useless delay. There might be a delay, but 100 ms+ is just you being silly.
The input lag of crappy LCDs is a real problem though. Fortunately the solution is easy: just don't buy crappy LCDs.
Any buffer adds delay.
And the only TFTs without input delay are gaming TNs, which are even more rubbish than the ones with delay.
So you'll have a triple buffer for the vsync, which adds a 2-frame delay, a TFT which adds another 2, and a prerendering buffer of length 3, which will give you roughly 100 ms.
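(Rough arithmetic, taking those frame counts at face value and assuming a 60 Hz refresh: (2 + 2 + 3) frames × 1000/60 ms per frame ≈ 117 ms, which is the ballpark being claimed here.)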
1920*1080 here on my main monitor (19" LCD). I only use it at 1280*720 though, 'tis a wee bit small for that huge resolution... 60Hz @ all resolutions except 1920*1080, where it drops to 30Hz interlaced. Just for the FPS war going on, it runs at 1280*720 locked to 100 FPS and only dips with large grids.
Secondary monitor (17" CRT) running 1152*864 @ 100Hz, but goes up to 1280*1024 @ 75Hz.
I think the benchmark should stay at 1024*768 though, as everybody can run that res to compare. I presume that using maths one can work out what it would run at on a higher resolution? Or does it not scale proportionally?
Plus the standard 125Hz @ 8ms USB lag (if you use a USB wheel) on top of the 5ms LCD latency, frame buffering, frame drops hurting the CPU's response time, and vsync's ultra 'uck' lag, 40 or 50 ms can probably be added on top of your ping time.
Yet, the USB latency can be tweaked down to 500Hz/2ms and 1000Hz/1ms with polling scripts.
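As a sketch of where those figures come from (nothing but the polling-interval arithmetic; the assumption is that the worst case is one full interval between reports):

```python
# Rough arithmetic only: assumed worst-case added input latency is one full
# polling interval, which is where the 8 ms / 2 ms / 1 ms figures come from.
for rate_hz in (125, 500, 1000):
    print(f"{rate_hz:4d} Hz polling -> up to {1000 / rate_hz:.0f} ms before an input is seen")
```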
Thank god the devs aren't from SimBin. Talk about device to screen latency.
It doesn't. There's a point at which the GPU becomes the bottleneck. It seems to depend on:
-The number of frames the CPU can throw at it
-The card itself
-Resolution
-AA/AF
With the current (max) benchmark settings the CPU is always the bottleneck, meaning an old video card seems to be just as fast as a new one, which gives a false impression.
With the new bench I planned, my avg fps went from 130 to 50-ish. This was with a full GTR grid (20) on SO Long Rev, at 1280*960 and 4x/16x. Without AA/AF I gained 10 fps, so at least my old card (850XT) had some work to do this time.
Why 4x, you might ask... Well, some Nvidias seem to jump from 4x to 8x and older ATIs do 6x max.
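To illustrate the bottleneck point above, here's a toy model with entirely made-up numbers (not measured from LFS or any real card): the framerate is roughly the slower of what the CPU can prepare and what the GPU can fill at a given resolution.

```python
# Toy model, made-up numbers (not measured from LFS or any real hardware):
# fps is capped by whichever of CPU and GPU is slower at the chosen resolution.
CPU_FPS = 130.0              # frames/s the CPU can prepare; roughly resolution-independent
GPU_PIXELS_PER_SEC = 180e6   # assumed pixel throughput at fixed detail/AA settings

def estimated_fps(width, height):
    gpu_fps = GPU_PIXELS_PER_SEC / (width * height)
    return min(CPU_FPS, gpu_fps)

for w, h in ((1024, 768), (1280, 960), (1600, 1200)):
    print(f"{w}x{h}: ~{estimated_fps(w, h):.0f} fps")
```

Until the GPU side drops below the CPU side, raising the resolution doesn't change the number at all, which is why a CPU-bound benchmark makes an old card look just as fast as a new one.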
No, it doesn't. Buffers are generally used to provide smoothness, not to add delay.
Triple buffering with vsync never adds a two-frame delay. Without vsync, double buffering is used, which works by having two planes the image is set to: the screen, where the end result is shown, and one back buffer, where the image is rendered. When the image is ready on the back buffer it is transferred to the screen. The reason the image is not rendered straight onto the screen is that it would generally look ugly, and it would be more difficult to code too.

With vsync, the back buffer waits for the monitor and transfers the image to the screen in sync with the screen refreshes. If you only used double buffering with vsync, the GPU would have to wait for the monitor sync before it could start rendering the next frame (which would cause a big delay). Which is why triple buffering was invented: it adds another back buffer so that the GPU can start rendering the next frame in one back buffer while the other is waiting for the screen sync. When the monitor syncs, the newest ready frame is transferred to the screen.

So the longest delay triple buffering + vsync can cause is the time between monitor syncs (for example, on a 60 Hz monitor the max delay is 17 ms). Without vsync there would of course be no delay, but then the screen would show half of one frame and half of another (which is called tearing). And you can't see any more frames than the monitor's refresh rate allows anyway, only parts of them.
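A minimal sketch of that behaviour (just a toy simulation, no real graphics API; the 60 Hz refresh and the GPU frame times are assumptions): with a third buffer the GPU keeps rendering, the newest finished frame is flipped at each vsync, and the extra wait caused by the buffering stays under one refresh interval.

```python
# Toy simulation of triple buffering + vsync (no real graphics API involved).
# Assumptions: a 60 Hz monitor and a GPU that finishes a frame every `render_ms`.
# At each vsync the newest completed frame is flipped to the screen; older
# completed frames are simply discarded. The GPU never has to stall.

REFRESH_MS = 1000.0 / 60.0   # time between vsyncs on a 60 Hz monitor

def average_buffer_wait(render_ms, flips=200):
    """Average time a finished frame sits in a back buffer before being shown."""
    waits = []
    done = []                     # completion times of frames waiting for a vsync
    next_done = render_ms         # when the GPU finishes its next frame
    vsync = REFRESH_MS            # time of the next vsync
    while len(waits) < flips:
        # The GPU keeps finishing frames right up to the vsync.
        while next_done <= vsync:
            done.append(next_done)
            next_done += render_ms
        if done:
            waits.append(vsync - done[-1])   # only the newest frame gets displayed
            done.clear()
        vsync += REFRESH_MS
    return sum(waits) / len(waits)

for render_ms in (5.0, 10.0, 16.7, 25.0):
    print(f"GPU frame time {render_ms:4.1f} ms -> avg extra wait "
          f"{average_buffer_wait(render_ms):4.1f} ms (always < {REFRESH_MS:.1f} ms)")
```

The wait never exceeds one refresh because anything finished before the previous vsync has already been flipped or overwritten, which is the "max delay is the time between syncs, ~17 ms at 60 Hz" point above.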
The input lag comes from the monitor's electronics, not from the panel type. On some monitors the electronics do some tricks to improve contrast or reduce ghosting or something like that, which of course takes time, which in turn makes the image lag. So it doesn't really depend on the panel type; it depends on how much the monitor fiddles with the image before showing it.
On the ugliness argument: yes, cheap panels are not very good in some respects. Contrast is still an issue for LCDs, and ghosting to some extent (not very much on modern monitors). On the other hand, LCDs have much better sharpness, and the image geometry is perfect too since every pixel is always the right shape on an LCD. And the no-flickering thing is great too.
I don't know exactly what that Nvidia setting does or how it does it. But it seems a bit odd that they would have added a 3-frame delay just to piss everyone off. It makes no sense to have extra buffers just to make the image lag. Maybe it tries to smooth the framerate or use the GPU more efficiently. Anyway, I am pretty sure it does not add 3 frames of lag like you think it does.
In conclusion, I can really see why someone would want a CRT, especially for gaming. In gaming, sharpness or non-flickering is not so important, but contrast, no ghosting, and a high refresh rate are. But on the other hand you can play on an LCD just fine; modern LCDs are good enough and they are getting better all the time. It's really about what compromises you are willing to accept.
I'll have to check when I get back home. I know my card (Gainward 8600GT 1GB) can knock out a hefty resolution, but I think the limit for me is my monitor (19" widescreen). I run the game with everything maxed out and my framerate is capped at 100fps. I'll take the limit off and see what it'll run at too.
One question though, why the increase in resolution? If you want to make the game look prettier, surely it would be better to work on texture details or track/car detail in general? Not to say that LFS doesn't look good, but it does pale a bit in comparison to something like Forza Motorsport 2...
I just explained how double and triple buffering work in computers. And I have also coded a simple double buffer myself. So I think I know how buffers in computer graphics work. Feel free to post your explanation if you think I am wrong.
Yes. But did you read mine? I just said that the TN panel itself does not cause the lag, nor is input lag a specific problem of any particular panel type.
Ok. I can't really argue about that any longer because I don't know how it works.
I know how it works, so thanks, but no need for the explanation... How is it that you can't understand that adding a buffer of length 3 will add a 2-frame delay, plus whatever partial frames you'll lose whenever your card can render faster than your monitor's refresh rate?
So? I've implemented admittedly much simpler buffers myself in VHDL and am able to calculate systemic delays.
For god's sake, I never claimed it was a problem of the panel type. What I said was that the only monitors you will find which have been built with REDUCING input lag in mind are gamer LCDs, all of which are TNs and thus rubbish for lots of other reasons.
So only when turning on 16x AA Q(uality) mode did the framerate start to drop, even at 1600x1200; gamma-corrected AA and transparency AF have even more effect.
Because with triple buffering the buffer chain's length is not two. It is just one, like with regular double buffering. The image is buffered in one of the buffers before it is sent to the screen; it does not go through both. The image does not go to buffer A, then to B, and then to the screen. The image goes to buffer A OR buffer B, then to the screen. The whole point of triple buffering is to reduce the delay and stop the GPU from idling so much. In the ideal situation the delay with triple buffering can be 0 ms, if the refresh rate matches the rendering speed of the GPU perfectly. Besides, there is only one more buffer compared to regular double buffering, not two.
I haven't personally inspected each and every monitor in existence. What I was saying is that there is no technical reason why the same input-lag-reducing stuff could not be applied to other panel types besides TN. And not all TN panels are built with gaming in mind. In fact, some TN monitors suffer from big input lag because the monitor tries to increase the contrast by applying some tricks to the image. Also, there are many panel types besides TN that are good for gaming too. If you want fast response and great picture quality you have to pay a bit more, though.