I use 1280x1024, but I never allow websites to stretch over more than 1024x768, and that only goes for sites with loads of content where I need the space; even then I do it seldom. The optimal size for a website displayed on a multitasking system is 800x600.
That's no shame. 17" and 19" (and some other sizes) LCD monitors actually have 5:4 panels, matching 1280x1024, so the pixels are square and the picture is not distorted.
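Quick sanity check in Python if anyone doubts the square-pixel bit (the 5.0 and 4.0 are just arbitrary units standing in for a 5:4 panel, not real measurements):

    # Why 1280x1024 on a 5:4 panel means square pixels.
    panel_w, panel_h = 5.0, 4.0   # physical 5:4 aspect, arbitrary units
    res_w, res_h = 1280, 1024

    pitch_x = panel_w / res_w     # horizontal pixel pitch
    pitch_y = panel_h / res_h     # vertical pixel pitch

    print(pitch_x == pitch_y)     # True -> square pixels, no distortion
    print(res_w / res_h)          # 1.25, i.e. exactly 5:4

Run 1280x1024 on a 4:3 CRT instead and those two pitches stop matching, which is where the stretched look comes from.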
My 4-year-old laptop has a 15" LCD that goes up to 1600x1200, and I use that all the time. Basically, if you get an LCD at that size that can't reach that resolution, you didn't pay enough for a good display.
Mmmm, well, for medium-performance systems it can help, but only by a little. I noticed a lot of the older Nvidia cards perform better running games in 16-bit, and if you try to use 32-bit the game will be slower. But ever since I've had an ATI card, 32-bit hasn't made a difference.
I run my 4-year-old 17" Packard Bell monitor at 1152x864x32. In 1024x768 mode everything is already too big for me, and 1280x960 is no option because it's at 60 Hz; I get major headaches after half an hour :Eyecrazy: Also, I run my games at the same resolution, so when I hit "quit" in a game or launch one from my desktop, it jumps straight back to Windows with no mode switch.
I have now 3 15" monitors, one of them is TFT others are CRT. So 3x 1024x768 (or 3072x768)for me. Just got these today, now I am trying to learn how to use all this space
Well, not quite. I have a GeForce 2 GTS Pro 64MB in my old desktop sitting here (now used for storage), and I remember that in games like Half-Life and other early titles, running in 16-bit would increase your FPS and give you smoother gameplay, at the cost of nasty-looking fades, but oh well. In other games I used to play, like some motorcycle games that came out in the GeForce 3 era, my GeForce 2 (and my two friends' similar-spec systems) couldn't run at 32-bit at all (Motocross Mania was the game, I think).
But like I said, it isn't really a problem any more; new cards can run 32-bit with no trouble.
When 32-bit is turned on instead of 16-bit on my onboard chip, FPS is cut in half in games. The worst example had to be the old '99 Unreal Tournament: the game was unplayable at 320x200x32, while 1024x768x16 was silky smooth.
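Rough numbers on why 32-bit hurts so much on weak chips (a very simplified sketch: one colour write per pixel per frame, ignoring Z-buffer, textures and overdraw, which only widen the gap):

    # Framebuffer traffic per frame at 16 vs 32 bpp.
    def frame_mb(width, height, bpp):
        return width * height * (bpp // 8) / 2**20   # MiB per frame

    for w, h, bpp in [(320, 200, 32), (1024, 768, 16), (1024, 768, 32)]:
        print(f"{w}x{h}x{bpp}: {frame_mb(w, h, bpp):.2f} MiB/frame")

    # 320x200x32  -> 0.24 MiB/frame
    # 1024x768x16 -> 1.50 MiB/frame
    # 1024x768x32 -> 3.00 MiB/frame

Doubling the bits per pixel roughly doubles the framebuffer traffic, and 320x200 pushes so few raw pixels that if it's still unplayable at 32-bit, the chip's 32-bit rendering path itself is the bottleneck, not the pixel count.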
That's the native res for 17" LCDs and looks best. It also applies to some 19" LCDs, but some are higher.
Native res means that anything below it will look distorted and blocky, since the screen has to scale that res onto its native grid, 1280x1024 in this case. Always run LCDs at their native resolution for best results. Sometimes that's hard in games because some people's graphics cards can't run that high, but if you can afford an LCD, surely you can afford a 6800 or an X800.
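Here's a little sketch of why non-native gets blocky (assuming nearest-neighbour scaling; real panels usually filter instead, which just trades the blocks for blur):

    # Nearest-neighbour upscale of 1024 source columns -> 1280 physical.
    # Each source column should cover 1.25 physical columns, which is
    # impossible, so some get 1 physical pixel and some get 2.
    src, dst = 1024, 1280
    hits = [0] * src
    for x in range(dst):
        hits[x * src // dst] += 1   # which source column lands here

    print(set(hits))    # {1, 2} -> uneven pixel widths
    print(hits[:8])     # [2, 1, 1, 1, 2, 1, 1, 1]

That 1-or-2 alternation across the screen is the blockiness. At native res the mapping is exactly 1:1 and the problem disappears.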
The standard has been bumped up now to 1600x1200 or higher. But like I mentioned before, even the 15" UXGA+ LCD on my laptop has a resolution of 1600x1200, and that's its native res according to Dell. When I ordered that screen, it was very expensive.
The native res you get mostly depends on the quality of the screen and how it's rated (XGA, UXGA, UXGA+, SXGA, etc.)...
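For reference, roughly what those ratings map to (SXGA+ thrown in for completeness; I've left "UXGA+" out since that looks like a vendor label rather than a standard designation):

    # Common panel ratings and their native resolutions.
    RATINGS = {
        "XGA":   (1024, 768),    # typical 14"-15" panels
        "SXGA":  (1280, 1024),   # typical 17"-19" panels, 5:4
        "SXGA+": (1400, 1050),
        "UXGA":  (1600, 1200),   # high-end 15" laptop panels
    }

    for name, (w, h) in RATINGS.items():
        print(f"{name:6} {w}x{h}  ({w * h / 1e6:.2f} Mpixels)")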