True. Infinity is impossible for humans to fully grasp because in reality infinity does not exist. There is no physical thing you can bind the mathematical term of infinity to.
It works with any number, not just with 1. Any number with a finite decimal expansion can also be represented with an infinite recurrence. For example 2.999... = 3 and 42.2332999... = 42.2333.
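One standard way to see why, sketched out for 2.999... (the same trick works for any of these):

$$
\begin{aligned}
x &= 2.999\ldots \\
10x &= 29.999\ldots \\
10x - x &= 29.999\ldots - 2.999\ldots = 27 \\
9x &= 27 \quad\Rightarrow\quad x = 3
\end{aligned}
$$

The infinite tail cancels exactly in the subtraction, which is why no "tiny bit" is left over.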
Mathematics isn't flawed; it all adds up. Everything in mathematics is proven from other mathematical statements.
Because with triple buffering the chain the image passes through is not two buffers long. It is just one, like with regular double buffering. The image is buffered in one of the buffers before it is sent to the screen; it does not go through both. The image does not go to buffer A, then to B, and then to the screen. It goes to buffer A OR buffer B, then to the screen. The whole point of triple buffering is to reduce the delay and keep the GPU from idling so much. In an ideal situation the delay with triple buffering can be 0 ms, if the refresh rate matches the rendering speed of the GPU perfectly. Besides, there is only one more buffer compared to regular double buffering, not two.
I haven't personally inspected each and every monitor in existence. What I was saying is that there is no technical reason why the same input-lag-reducing tricks could not be applied to other panel types besides TN. And not all TN panels are built with gaming in mind. In fact some TN monitors suffer from big input lag because the monitor tries to increase the contrast by applying some tricks to the image. There are also many panel types besides TN that are good for gaming. If you want fast response and great picture quality you have to pay a bit more, though.
I just explained how double and triple buffering work in computers. I have also coded a simple double buffering implementation myself, so I think I know how buffers in computer graphics work. Feel free to post your explanation if you think I am wrong.
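To illustrate what I mean by a simple double buffering implementation, here is a minimal sketch of the idea. The names are made up for illustration, not from any real graphics API:

```python
# Minimal double-buffering sketch: draw the whole frame into a back
# buffer, then swap it to the "screen" in one step, so the viewer
# never sees a half-drawn image.

class DoubleBuffer:
    def __init__(self, size):
        self.front = [0] * size   # what the "screen" currently shows
        self.back = [0] * size    # where the next frame is rendered

    def render(self, frame):
        # Draw the whole frame into the back buffer first.
        for i, pixel in enumerate(frame):
            self.back[i] = pixel

    def swap(self):
        # Present the finished frame; the old front becomes the new back.
        self.front, self.back = self.back, self.front

db = DoubleBuffer(4)
db.render([1, 2, 3, 4])
db.swap()
print(db.front)  # [1, 2, 3, 4] — the finished frame is now on "screen"
```

The key point is that the swap is a single cheap operation (just exchanging which buffer is which), so the screen only ever shows complete frames.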
Yes. But do you read mine? I just said that the TN panel itself does not cause the lag, nor is input lag a problem specific to any panel type.
Ok. I can't really argue about that any longer because I don't know how it works.
No it doesn't. Buffers are generally used to provide smoothness, not to add delays.
Triple buffering with vsync never adds a two-frame delay. Without vsync, double buffering is used, which works by having two surfaces the image is drawn on: the screen, where the end result is shown, and one backbuffer, where the image is rendered. When the image is ready on the backbuffer it is transferred to the screen. The reason the image is not rendered straight on the screen is that it would generally look ugly, and it would be more difficult to code too.

With vsync the backbuffer waits for the monitor and transfers the image to the screen in sync with the screen refreshes. If you used only double buffering with vsync, the GPU would have to wait for the monitor sync before it could start rendering the next frame (which would cause a high delay). That is why triple buffering was invented: it adds another backbuffer so that the GPU can start rendering the next frame on one backbuffer while the other is waiting for the screen sync. When the monitor syncs, the newest ready frame is transferred to the screen. So the longest delay triple buffering + vsync can cause is the time between monitor syncs (for example, on a 60 Hz monitor the max delay is about 17 ms).

Without vsync there would of course be no delay, but then the screen would show half of one frame and half of another (which is called tearing). You can't see more frames than the monitor's refresh rate allows, only parts of them.
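The logic above can be sketched as a toy simulation. This is just an illustration of the scheme I described (all names are made up), not how any real driver implements it:

```python
# Toy simulation of triple buffering + vsync: the GPU keeps rendering
# into a free backbuffer while a finished frame waits for the next
# monitor refresh. If the GPU gets ahead, the older waiting frame is
# simply replaced, so the screen always gets the newest ready frame.

class TripleBufferChain:
    def __init__(self):
        self.front = None    # frame currently on screen
        self.ready = None    # newest finished frame awaiting vsync

    def gpu_finish_frame(self, frame_id):
        # The GPU never blocks: a newly finished frame replaces any
        # older frame that was still waiting for the sync.
        self.ready = frame_id

    def vsync(self):
        # At each monitor refresh, present the newest ready frame.
        if self.ready is not None:
            self.front = self.ready
            self.ready = None
        return self.front

chain = TripleBufferChain()
chain.gpu_finish_frame(1)
chain.gpu_finish_frame(2)   # GPU got ahead; frame 1 is dropped
print(chain.vsync())        # 2 — the newest frame is shown
chain.gpu_finish_frame(3)
print(chain.vsync())        # 3
```

This is also why the worst-case delay is one refresh interval: a finished frame waits at most until the next `vsync()` call before it is shown.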
The input lag comes from the monitor's electronics, not from the panel type. On some monitors the electronics apply tricks to improve contrast or reduce ghosting or something like that, which of course takes time, which in turn makes the image lag. So it doesn't really depend on the panel type; it depends on how much the monitor fiddles with the image before showing it.
On the ugliness argument: yes, cheap panels are not very good in some respects. Contrast is still an issue for LCDs, and ghosting to some extent (though not much on modern monitors). On the other hand, LCDs have much better sharpness, and the image geometry is perfect too, since every pixel is always the right shape on an LCD. The lack of flickering is great as well.
I don't know exactly what that nvidia setting does or how it does it. But it seems a bit odd that they would have added a 3-frame delay just to piss everyone off. It makes no sense to have extra buffers just to make the image lag. Maybe it tries to smooth the framerate or use the GPU more efficiently. Anyway, I am pretty sure it does not add a 3-frame lag like you think it does.
In conclusion, I can really see why someone would want a CRT, especially for gaming. In gaming, sharpness and non-flickering are not so important, but contrast, no ghosting and a high refresh rate are. On the other hand you can play on an LCD just fine; modern LCDs are good enough and they are getting better all the time. It's really about what compromises you are willing to accept.
Are you sure you got it right? The prerender setting in the nvidia drivers doesn't mean that the card renders frames with a 3-frame delay. What googling revealed is that it does some kind of framerate smoothing with extra buffers, or something like that. At least it's not anything silly like just adding a useless delay. There might be a delay, but 100 ms+ is just you being silly.
The input lag of crappy LCDs is a real problem, though. Fortunately the solution is easy: just don't buy crappy LCDs.
True. It can come up more easily because the fps only has to be higher than 60, but it still looks just as ugly.
I'd much rather have a small delay than a distorted image. It's not like the delay is that big anyway. Maybe if you play CS or Quake for money it makes a difference.
The thing with the Veyron is that it's not only fast; it's also a hugely comfortable luxury cruiser, and it's very easy to drive (fast or slow).
The 9ff, on the other hand, is basically a race car for the road. Sure it is fast, but it would probably kill your back if you drove it more than 100 meters in one session, and you would look silly climbing over the roll bars covered in sweat when you arrived at the most prestigious hotel in Paris.
It's better to set the wheel's rotation to 900 degrees and then set wheel turn compensation to 1 in LFS. Then you will have the same degrees of steering as the car in LFS has. That way you can jump from the UF1 to the MRT and the steering degrees will be correct without any need to fiddle with settings.
The only one I can agree with 100% is the one in BL1. It's the only one where you really have to drive there on purpose to hit the penalty area. All the others seem like places you could end up on by mistake, or if someone nudges you. Especially on the FE tracks it is really easy to hit the DT spots for reasons outside your own control.
But reality is not black and white like that. Sometimes you have to drive on the grass to avoid crashed cars, or you make a mistake and end up going onto the grass or hitting the wall. You don't get penalized for things like that in real races either...
This system is supposed to prevent blatant cutting, not make races annoying. HLVC only works on hotlaps because on a hotlap there are no other cars on the track, and the whole point is to drive the fastest possible lap. On a hotlap, if you drive on the grass, you either made a mistake or you are cutting, so a penalty is always in place. In a race there are more reasons why you could end up there, and it shouldn't mean your race is over.