I've always used Nvidia, so I wouldn't know. However, it is possible that people are using custom textures, or maybe a different setting for the time of day (most people race on the brightest setting).
I remember Tom's Hardware running an article about the differences between how ATI and Nvidia render effects, though I'm fairly sure it was limited to bump mapping and different shader crap no-one really needs.
My guesses? Image editing, custom textures or psychology.
I changed from a 6800GS to an X1900 XTX, and other than extra eye-candy filters (more AF and AA) and better performance, I saw very little difference in the way of raw image quality.
The most probable reason is that most people touch up their screenshots before posting.
I moved from an ATI 9800Pro to an nVidia 6800GS a couple of months ago. The nVidia image quality is significantly worse; using the same resolution, AA and AF settings, the nVidia card exhibits anisotropic filtering artefacts (texture smearing, texture shimmering, moire patterns) and anti-aliasing artefacts (poor smoothing, loss of fine detail) that did not happen with the ATI card. I eventually decided to dump the 6800GS and replace it with an ATI X850XT, which should arrive next week.
Reet, try downloading this tool and manually setting the 3D options.
Set things such as adaptive AA; mipmap and texture quality to high; high quality AF; both the AA and AF filtering optimizations; etc.
Then see if it changes anything; if it doesn't, or you get much lower FPS, you can always revert to the standard settings.
Also don't forget to play with the in-game sliders, because the highest values don't always give the nicest image.
Here are some comparison screenshots that I made several weeks ago when discussing ATI vs nVidia image quality on another board. The images were captured in rFactor since that was the game being discussed but the same image quality differences are seen in LFS too.
All images are captured at 1280x960x32 with 4xAA and 8xAF enabled in the drivers. The ATI card is a 9800Pro and the nVidia card is a 6800GS. The cards were swapped into the same system so all other components are identical. I spent several hours trying different nVidia drivers (currently running ForceWare 91.28) and tweaking settings to get the best possible image quality vs performance that I could.
The moiré pattern effect seen in the nVidia image causes texture shimmering in-game when moving. You can also see some texture smearing of the white lines in the nVidia image that isn't present in the ATI image.
The texture smearing seen above is due to having Negative LOD Bias set to Clamp rather than Allow in the drivers. Negative LOD bias sharpens the textures but also causes terrible texture shimmering when the scene is in motion. The texture shimmering is bad enough to be an eyesore so I leave my setting at Clamp as a compromise.
The thing is, with ATI there's no need to compromise: you can have clean lines and no shimmering at the same time.
The general consensus seems to be that ATI quality is a bit lower. Tom's Hardware Guide suggested that the image quality of an ATI card at 2x AA and 2x AF is about the same as no AA or AF on an NVidia.
I switched to an ATI card when I got one for free (a 9800 XT) and I get the same 'texture smearing' effect you describe. Sadly my old NVidia card died around the same time because I mishandled the fan on it, so I'm stuck with the texture smearing until I buy a new GFX card. That will be an NVidia again, so I can get good image quality again.
That is the direct opposite of my experience and the exact opposite of the image quality differences seen in the screenshots I posted above. I'll believe concrete proof and the evidence of my own eyes.
The following screenshot was taken using a GF 6800GT 128MB with the following settings:
Drivers
AA: 8x
AF: 16x
Image quality: High
Negative LOD Bias: Allow
LFS
Resolution: 2048x1280
User LOD: 1.00
Dust LOD: 1.00
Mirror LOD: 0.10
Screen width affects LOD: Yes
Track LOD reduction: No
Mip bias: 0.00 (setting this lower results in the moire effect you described)
Enable dither: No
Enable mip filter: No
Half texture size: No
Hardware Vertex Shading: Yes
Z-Buffer Depth: 24-bit
Simple track: Off
Haze effect: Off
EDIT: I should note that setting Negative LOD Bias to "Clamp" does result in the texture smearing seen above, though why you would use that setting I have no idea. As mentioned, this screenshot was taken with it set to "Allow." Clearly the above screenshots were, shall we say, biased against Nvidia...
If you had bothered to read my post instead of leaping to the conclusion that I have some agenda against nVidia, you would have seen that I set the bias to Clamp because when it is set to Allow there is major texture shimmering when the scene is in motion. If you think I'm making that up, check the Negative LOD bias section here: http://www.nhancer.com/help/optimizations.php (nHancer is a tool for tweaking nVidia driver settings, by the way). Here's the relevant text:
"Negative LOD BIAS
The so called Level of Detail BIAS (LOD BIAS) controls at which distance from the viewer the switch to lower resolution mip maps takes place (see here for more details about mip maps). The standard value of the LOD BIAS is 0.0 (zero). If you lower the LOD BIAS below zero, the mip map levels are moved farther away, resulting in seemingly sharper textures. But if the scene is moving, the textures start to shimmer.
Because of this, it's not a good idea to use a lower LOD BIAS to improve the sharpness of the image. It's better to use an Anisotropic Filter instead.
Some games force a negative LOD BIAS nevertheless. The result is heavy texture shimmering. To avoid this, the driver can clamp the LOD BIAS to zero. That means that the LOD BIAS can still be raised above zero, but it cannot be set lower than zero."
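To make that concrete, here's a rough Python sketch of how I understand the mip selection maths (purely illustrative; the function and the numbers are mine, not anything from the actual drivers):

import math

def mip_level(texel_footprint, lod_bias, clamp_negative):
    # Pick a mip level for a pixel whose screen footprint spans
    # texel_footprint texels; level 0 is the full-resolution texture.
    bias = max(lod_bias, 0.0) if clamp_negative else lod_bias
    return max(math.log2(texel_footprint) + bias, 0.0)

# Allow: a -1.0 bias selects a mip one level sharper than the footprint
# warrants -> under-sampling -> the shimmer you see once the scene moves.
print(mip_level(4.0, -1.0, clamp_negative=False))  # 1.0 instead of 2.0
# Clamp: the negative bias is thrown away, so no shimmer, but nVidia's
# AF then leaves the textures looking smeared, which is my whole complaint.
print(mip_level(4.0, -1.0, clamp_negative=True))   # 2.0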
Check out the road markings and rumble strips at South City, and the track lines, rumble strips and main start/finish straight grandstands at Kyoto Ring, for examples of bad texture shimmer in LFS. Other games such as GTR2, rFactor and GTL all exhibit very heavy texture shimmering.
So, with nVidia, I am forced to compromise between texture smearing or texture shimmering. That's like having to choose between a chocolate flavoured pile of crap and a vanilla flavoured pile of crap. With ATI, on the other hand, you are not forced into a compromise and can have crisp detailed textures without texture shimmer. Hence my choice to absorb the cost of the nVidia card, call it an object lesson and move back to ATI. If you're happy slurping down variously flavoured piles of crap, more power to you, just don't try to convince others that what you're eating tastes great and that they should have some too.
To those who are interested in the differences in image quality between ATI and nVidia cards, I suggest doing a Google search on the keywords "image quality" nVidia ATI, doing some research, and seeing what the general consensus of opinion is for yourselves; don't rely on a handful of posts from any one forum.
Could it be that it differs from model to model, even from the same manufacturer? I mean, I switched from a 7800GTX to two 7900GTXs, and while I didn't take any reference screenshots, it looks sharper to me.
I've always found my ATi to look better than my brother's nVidia. If ATi could actually produce reasonable drivers (now I can't even get an ATi driver without the crapalyst feckup centre), I'd be tempted to stay with ATi.
In order to provide a more apples-to-apples comparison, I've posted 3 screenshots at a lower resolution, all taken at the same point in the same replay. I did my best to find a spot where the texture shimmering was worst on my machine.
You will notice that in the first screenshot, with Negative LOD Bias set to "Allow" and Mip bias set to 0, there is no texture smearing and only a minor amount of texture shimmering on the guardrail on the right.
I ask that someone with an ATI card do the same. Please download the replay, stop it at exactly 24 seconds into the first lap, and take a screenshot at a resolution of 1280x1024 with Mip bias set to 0 and -4.
Okay, so here are the ATI screenshots with Mip level 0 and -4. Unfortunately, I could not find a way to clamp the setting in any of the control panels (including ATI Tray Tools). If someone is aware of the checkbox, please tell me and I'll post the third shot.
My Catalyst version is 06.6 (Driver Packaging Version 8.263-060607a-033678C-ATI).
All settings (AF/AA) are set to the highest possible quality.
Edit: There must be something wrong; apparently the settings behave the opposite way to NVidia's. However, as there is no way to clamp, I cannot say for sure which level is actually used... :\
With the Negative LOD Bias set to "Clamp" in the Nvidia drivers, the Mip bias setting in LFS does nothing and you get that texture smearing. Since your Mip bias setting is working fine, don't worry about it.
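As I understand the Clamp option, the driver simply discards any negative bias the application requests, which is why the slider stops doing anything. A tiny illustrative sketch (the function name is mine):

def effective_lod_bias(app_bias, clamp):
    # What the card ends up using when a game (e.g. LFS's Mip bias
    # slider) requests a bias and the driver's Clamp option is on/off.
    return max(app_bias, 0.0) if clamp else app_bias

print(effective_lod_bias(-0.5, clamp=True))   # 0.0 -> the slider does nothing
print(effective_lod_bias(-0.5, clamp=False))  # -0.5 -> the slider works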
BTW, I use an FOV of 90 degrees.
Anyway, I suppose this answers your question. No, Nvidia does not look better than ATI, but it's also no worse. If there are any differences, they are very difficult to spot.
I use SLI'd 7800GTs and I don't relate to the problems some of you describe for nVidia. HOWEVER, I did get texture smearing when I tried some of Honey's track textures, really quite bad in places, so I think textures have a lot to do with it. Since going back to other textures I've not had any issues, and I still use some of Honey's darker textures, so I think the problem described occurs in very specific circumstances.
Texture shimmering is not seen in static screenshots, it only becomes apparent when the scene is in motion. Don't take my word for it, read the section about Negative LOD Bias in the nHancer link I provided previously "...[b]ut if the scene is moving, the textures start to shimmer." So, all your test shows is that mipmap LOD bias settings affect texture sharpness. Yes, that's exactly what it's supposed to do.
The point that you seem to be taking pains to ignore is that the reason you have to adjust mipmap LOD bias settings with nVidia cards in the first place is that nVidia's anisotropic filtering results in texture smearing. Unfortunately, sharpening textures through negative mipmap LOD biases results in texture shimmer when the scene is in motion. So, you're forced to find a compromise between smearing and shimmer and that compromise varies from game to game and even track to track sometimes.
With ATI, you're not forced into making a compromise because their anisotropic filtering does not result in smeared textures. This means that you don't have to adjust mipmap LOD bias settings at all. You get crisp, detailed textures with no smearing and no shimmering at the same time.
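For anyone wondering how AF manages that, here's a toy Python sketch of the basic idea (entirely my own illustration with made-up names, not how any real driver implements it): the mip level comes from the short axis of the pixel's footprint, and several taps are averaged along the long axis.

import math

def sample_trilinear(tex, u, v, lod):
    # Placeholder for a real trilinear fetch: it just reads the nearest
    # texel of a toy list-of-lists texture and ignores lod.
    n = len(tex)
    return tex[int(v * n) % n][int(u * n) % n]

def sample_anisotropic(tex, u, v, major, minor, max_taps=16):
    # Sharp mip from the SHORT axis (no smearing), several taps averaged
    # along the LONG axis (no shimmer), so no negative LOD bias is needed.
    len_major = math.hypot(*major)
    len_minor = max(math.hypot(*minor), 1e-6)
    taps = max(1, min(max_taps, math.ceil(len_major / len_minor)))
    lod = max(math.log2(len_minor), 0.0)
    total = 0.0
    for i in range(taps):
        t = (i + 0.5) / taps - 0.5   # spread taps along the major axis
        total += sample_trilinear(tex, u + t * major[0], v + t * major[1], lod)
    return total / taps

# Toy usage: a 4x4 checker texture, footprint stretched 4:1 along u.
checker = [[(x + y) % 2 for x in range(4)] for y in range(4)]
print(sample_anisotropic(checker, 0.3, 0.3, major=(0.5, 0.0), minor=(0.0, 0.125)))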
From an image quality standpoint, the choice between ATI and nVidia is a no-brainer.
I encourage anyone considering a new video card to do some careful research into the differences between ATI and nVidia image quality before committing to any card or any brand. Unless you're looking to buy one of these (http://pcgames.gwn.com/news/story.php/id/10042/), in which case, nVidia all the way baby!