Isn't AA just one of those things that get switched on (via the game menus) with the texture detail settings? I know when I switch on something like "32 bit textures" or "Detailed textures" or "High graphics" it usually results in the pixels being blurred beyond recognition for that uber "smooth" effect.
If the game was designed to run AA, surely the devs would've put the option in the menu?
Either way, it's an option that can be turned on or off at people's own discretion. Some like it, some don't. Difference is what makes the world interesting.
Judging from what you're saying in your posts, it seems you have no idea how it works.
Just because the devs of a game don't put an option in the menu to turn AA or AF on/off doesn't mean anything at all.
It's the graphics card that handles AA and AF; the game's 3D engine has nothing to do with it at all.
Do you know what the difference between 16-bit and 32-bit actually is?
It means you can use more colours, so the graphics card has more to process. In the past, with cards like the NVIDIA TNT or Voodoo2/3, there was a performance difference between 16-bit and 32-bit, but nowadays it's basically zero percent.
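To make the numbers concrete: 16-bit usually means RGB565 (5 bits red, 6 green, 5 blue = 65,536 colours), while 32-bit is 8 bits each for red, green, blue plus alpha. Here's a minimal C++ sketch of how the two formats pack a pixel, just for illustration (not from any real engine):

```cpp
#include <cstdint>
#include <cstdio>

// Pack an 8-bit-per-channel colour into 16-bit RGB565:
// 5 bits red, 6 bits green, 5 bits blue -> 65,536 possible colours.
uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b) {
    return static_cast<uint16_t>(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

// Pack the same colour into 32-bit ARGB8888:
// 8 bits per channel plus 8 bits of alpha -> ~16.7 million colours.
uint32_t pack_argb8888(uint8_t a, uint8_t r, uint8_t g, uint8_t b) {
    return (static_cast<uint32_t>(a) << 24) | (static_cast<uint32_t>(r) << 16)
         | (static_cast<uint32_t>(g) << 8) | b;
}

int main() {
    // In 16-bit the low bits of each channel are thrown away, which is
    // what causes the visible banding in smooth gradients and skies.
    printf("16-bit: 0x%04X\n", pack_rgb565(240, 130, 20));
    printf("32-bit: 0x%08X\n", pack_argb8888(255, 240, 130, 20));
}
```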
"Detailed textures" usually means a hi-res version of the textures is used.
I would appreciate it if you took a less condescending tone towards me in future.
Yes, I do understand the difference between 16 and 32 bit. 16-bit is 65,536 colours per image (or a little over 32,000 in the 15-bit "high colour" modes that leave one bit unused). 32-bit images aren't actually 32-bit; they're 24-bit with an additional 8 bits allocated to the alpha channel. The alpha channel stores information for special graphics effects such as transparency and overlapping pixels... that is what I understood of it, anyway. Never really looked that far into it, so correct me if I'm wrong.
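From what I've read, the way that alpha value actually gets used is a simple weighted mix; something like this C++ sketch (illustrative only, not any game's real code):

```cpp
#include <cstdint>
#include <cstdio>

// Standard "over" alpha blend of one channel: mix a source pixel onto a
// destination pixel using the source's 8-bit alpha
// (0 = fully transparent, 255 = fully opaque).
uint8_t blend_channel(uint8_t src, uint8_t dst, uint8_t alpha) {
    return static_cast<uint8_t>((src * alpha + dst * (255 - alpha)) / 255);
}

int main() {
    // 50% transparent red drawn over a white background comes out pink.
    uint8_t a = 128;
    printf("R=%d G=%d B=%d\n",
           blend_channel(255, 255, a),   // red channel
           blend_channel(0,   255, a),   // green channel
           blend_channel(0,   255, a));  // blue channel
}
```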
You saying that there is no difference in games between 16-bit and 32-bit graphics is another assumption that I have no idea what I'm talking about, and that I'm imagining things when I see a difference between 16-bit and 32-bit graphics inside a game, which indeed I do.
The FACT that there have been games released in the past with the option to turn Anti-Aliasing on or off further complements my previous post saying that if the game was designed to run with AA, then it should be in the menu. I know Monster Truck Madness 2 had the option in the menu, as it's one of the first games I put my 3DFX 12MB Voodoo 2 card through its paces with. There are more games with this option that I've come across in the past, but remembering back that far is a hard task!
You saying that devs not putting the option in the menu means nothing at all is a little absurd! So the developers only want people who know how to dig six menus deep into their video card's graphics properties to have "better" graphics? If that's what you're implying, it seems a little strange that game developers wouldn't want to offer the typical user the best graphics the game can produce, regardless of the option being rendering handled by the graphics card itself. So why wouldn't they offer a checkbox that turns on AA? I guess it's a question only a developer could answer...
Here's a bit of interesting info I found on an Intel website about AA:
Perhaps the reason I don't often use AA is that I always run the game at its maximum resolution. In the case of LFS, 1280x1024.
I think it's easier to leave it out of the game menus. It would require more work playing around with all the different hardware's driver software between LFS and the card, and there are lots of those programs and so on. I don't really know, I'm just making assumptions. And the quality settings aren't that hard to change, really. I'll post a pic; it takes me 3 seconds to change mine. But I really don't get how someone could not want smooth lines and would rather play without AA. You have one twisted mind there illepall
It's a little more difficult to setup for my card. (see screenshot)
You have to create a profile for the game, because LFS doesn't create one itself. Then you have to assign individual functions such as AA and what level you want it to function at.
I'm perfectly capable of such a task, but my concern was more for the weekend gamer who doesn't delve into such things. Maybe I'm wrong and almost every gamer plays with their video card settings in Windows, but I just didn't think so.
Ahh well. Either way, some people like it, some people don't. Shall we continue the thread on its original, awesome and righteous path?
Hi,
I didn't want a flame war, and I have to agree it was a bit harsh of me :-( Really sorry. I was just more than confused that you were talking about it as though it works in a way it doesn't.
Devs often do put an option in game menus to turn AA and AF on/off, because there are plenty of people who don't know they can turn it on/off in the drivers, or where to even find it. I have built so many PCs for my friends and whoever, and it was a problem. I'd set AA+AF in the drivers at a level that handled their particular games fine, but then they'd get horrible FPS in another game later. It was really hard to make them understand, and to show them how to play with the settings for each game (especially AA+AF).
That's why devs put it in game menus: it's easier for less experienced people to turn it on/off in game rather than setting up profiles in the drivers. Most people simply don't know about it, or even understand it. Yeah, I know it might look confusing to tech people, but that's how it really works.
You can find people around the forum asking how to get graphics as nice as the official screenshots. They say they've turned on every option in LFS and still don't get such a nice result. So many times people have explained here how to get to AA and AF in the drivers. Then the same people come back asking why it runs so slowly now, and you have to explain again and again how it really works.
AA+AF really has nothing to do with the game's 3D engine itself. Really, believe me. When it's possible to set it up in game, it means the game can reach the graphics drivers and set AA+AF itself (that's equivalent to the "Application-controlled" option in the NVIDIA drivers).
There is really no such thing as purely in-game AA+AF. It's always the graphics card that performs it, through the drivers. Hope it's clearer now.
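When a game does offer an AA checkbox, all it's doing at startup is asking the graphics API for a multisampled back buffer, and the driver/card does the rest. Roughly like this in Direct3D 9 terms (just a sketch to show the idea, not LFS's actual code):

```cpp
#include <d3d9.h>

// Sketch: how a game can request 4x multisampling itself. This is the case
// the "Application-controlled" setting in the NVIDIA drivers defers to.
// Assumes d3d was already created with Direct3DCreate9() and pp has its
// BackBufferFormat and Windowed fields filled in.
bool RequestFourXMSAA(IDirect3D9* d3d, D3DPRESENT_PARAMETERS& pp)
{
    DWORD qualityLevels = 0;
    // Ask the card whether it supports 4x MSAA for this back-buffer format.
    if (FAILED(d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, pp.BackBufferFormat,
            pp.Windowed, D3DMULTISAMPLE_4_SAMPLES, &qualityLevels)))
        return false; // not supported -- fall back to no AA

    pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
    pp.MultiSampleQuality = 0;                      // lowest valid quality level
    pp.SwapEffect         = D3DSWAPEFFECT_DISCARD;  // required for MSAA
    return true;
}
```

Either way, game checkbox or driver override, it's the card doing the multisampling; the checkbox just saves people a trip into the driver control panel.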
I also have to agree with the Intel quote. Nowadays some graphics cards can perform 2-4x AA with next to no performance loss. Well, if you don't count losing 1-3 FPS from 70 FPS.
For LFS I can set something like 8x AA + 16x AF without any FPS loss.
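A quick way to see why those last few FPS are so cheap is to look at frame time instead of FPS. A tiny C++ sketch of the arithmetic:

```cpp
#include <cstdio>

// FPS differences are misleading; the real cost of AA is the extra
// milliseconds per frame, and that grows the lower your FPS already is.
double FrameMs(double fps) { return 1000.0 / fps; }

int main() {
    // Losing 3 FPS at 70 costs far less GPU time than losing 10 FPS at 40.
    printf("70 -> 67 FPS: +%.2f ms per frame\n", FrameMs(67) - FrameMs(70));
    printf("40 -> 30 FPS: +%.2f ms per frame\n", FrameMs(30) - FrameMs(40));
}
```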
And to be honest, Intel is basically talking there about the integrated graphics in their chipsets, which is really the worst graphics hardware you can have nowadays. AA works just as well with DirectX games, not only with OpenGL. There are more and more OpenGL-based titles, though there will probably be none on Windows Vista, since Microsoft has said OpenGL there will only have software support layered on DX10.
The Intel quote was maybe very accurate a few years ago, but not now, and not for every game.
I get a frame-rate loss due to AA and AF (and changing the quality setting), but it isn't a dramatic loss (only 10-15 FPS from about 40 FPS) using 8x AA and 16x AF, so it doesn't really bother me too much. If it was my old gfx card I would get 10 FPS. I get nice screens, though.
How much AA and AF you can set always depends on the power of the graphics card.
From something like a GF6600 and up there is hardly any FPS loss with 4x AA + 8x AF.
I'm not forcing anyone to use AA+AF. People have to know for themselves what their hardware can and can't handle.
I was bored, so I figured I'd give drifting a shot. Using the best drifting setup in the world (Bob Smith's Road Going [with passengers removed]), I went for a spin around Blackwood (and who knew you could only get one lap out of your tyres before they have next to no grip?).
What did we learn from this? Drifting is too easy, though strangely fun (more so in the esses). That said, I don't think I'll give up my day job of being the slowest racer on the track.
And now, a quick poser shot:
Someone remind me to buy a new monitor, limited to 1024x768 sucks
There is no way the right side has AF; it loses clarity too soon. If it does, then either your card sucks at rendering AF, or you have it set stupidly low. Though I do agree, AA makes my eyes hurt and usually gives me a headache (and in FPS games has been known to make me dizzy), but it looks good for glory shots, and with the graphics card I have, I'm too lazy to turn it off as I see no difference in performance either way.
I never turned AF off on any of my cards; it was always maxed, and I lost maybe 5 FPS at a push (bar my MX440, which lost about 10-15 depending on the game).