Zay, you're mixing up two things.
CPUs are 32- or 64-bit nowadays.
Graphics resolution is as Flame said above
Hehe (I don't dare say "fail" to you)
Cheers Mate
It does? Look at the glovebox in the first example: there is a weird brown stripe stretching over it at 16-bit color depth. The whole dashboard looks pretty ugly in example B in 16-bit, if you ask me...
utter bull
You need a higher bit depth than we currently have (actually, 30 bits would be close to fine if 32-bit colours were actually 32-bit instead of 24) to get every nuance of the spectrum a human can distinguish.
More importantly, however, you need a much higher bit depth to get accurate mappings from one colour space into another when working in a calibrated environment (or if, like 99% of all people I know, you want to rip your eyes out after using Windows in Adobe RGB for more than 10 minutes).
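To put rough numbers on that, a throwaway C snippet (nothing platform-specific, just the arithmetic):

    #include <stdio.h>

    int main(void) {
        /* Per-channel levels and total colours for common depths.
           "32-bit" colour is really 24 bits of colour plus 8 bits of alpha. */
        printf("24-bit (8 bpc):  %u levels/channel, %lu colours\n",
               1u << 8, 1ul << 24);
        printf("30-bit (10 bpc): %u levels/channel, %lu colours\n",
               1u << 10, 1ul << 30);
        return 0;
    }

256 levels per channel is where you still get visible banding in smooth gradients; 10 bits per channel (1024 levels) gets much closer to covering what the eye can distinguish.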
The green tint comes from the fact that there are 6 bits of green and only 5 bits each of red and blue.
With 32 bits all three have 8 bits. And the last 8 bits are the alpha channel, in case you were wondering why 8+8+8 doesn't add up to 32.
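For anyone curious how the bits actually line up, here's a rough C sketch of the two layouts (pack565 and pack8888 are just made-up helper names for illustration):

    #include <stdint.h>
    #include <stdio.h>

    /* RGB565: red and blue drop their 3 low bits, green drops only 2,
       so green keeps the most detail. */
    static uint16_t pack565(uint8_t r, uint8_t g, uint8_t b) {
        return (uint16_t)((r >> 3) << 11 | (g >> 2) << 5 | (b >> 3));
    }

    /* ARGB8888: every channel keeps all 8 bits, plus 8 bits of alpha
       on top (8+8+8+8 = 32). */
    static uint32_t pack8888(uint8_t a, uint8_t r, uint8_t g, uint8_t b) {
        return (uint32_t)a << 24 | (uint32_t)r << 16 | (uint32_t)g << 8 | b;
    }

    int main(void) {
        printf("565:  0x%04x\n", pack565(200, 200, 200));
        printf("8888: 0x%08x\n", pack8888(255, 200, 200, 200));
        return 0;
    }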
To CPU mode.
In the case of Windows, the main advantage of 64-bit is PatchGuard (also known as Kernel Patch Protection), which pretty much renders kernel-mode rootkits useless.
32-bit = 8-bit red, 8-bit green, 8-bit blue, 8-bit alpha. Because current monitors do not support an alpha channel, they really are 24-bit, but people are used to calling them '32-bit'.
16-bit is 5-bit red, 6-bit green, 5-bit blue; that's why e.g. smoke looks greenish in 16-bit mode.
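Here's a little round-trip sketch of why that happens: the 6-bit green channel quantises differently from the 5-bit red and blue, so a neutral grey doesn't come back neutral (this assumes the common bit-replication trick for expanding back to 8 bits; the exact direction of the cast depends on the value and the driver):

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint8_t grey = 200;        /* neutral input: R = G = B = 200 */
        uint8_t r5 = grey >> 3;    /* 5-bit red:   25 of 31 levels */
        uint8_t g6 = grey >> 2;    /* 6-bit green: 50 of 63 levels */
        uint8_t b5 = grey >> 3;    /* 5-bit blue:  25 of 31 levels */

        /* Expand back to 8 bits by bit replication. */
        uint8_t r = (uint8_t)(r5 << 3 | r5 >> 2);
        uint8_t g = (uint8_t)(g6 << 2 | g6 >> 4);
        uint8_t b = (uint8_t)(b5 << 3 | b5 >> 2);

        /* Prints in: (200,200,200)  out: (206,203,206) --
           the channels no longer match. */
        printf("in: (%d,%d,%d)  out: (%d,%d,%d)\n", grey, grey, grey, r, g, b);
        return 0;
    }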
^ snap, I guess I will read all posts before replying