I think you guys are getting to the point I was trying to make with the bit about HDR: why try to simulate something your hardware can't display when you can adjust your hardware to bring the contrast to a more realistic level within the range your monitor can display?
I'm not saying you should kill your upper contrast, but I think it does look better with the contrast enhanced.
As for the clouds, I looked at them again and they do seem a bit overexposed, but only where the sun is. I compared that to the clouds around the sun today (hurt my eyes doing it) and it does look much closer to what I saw; possibly just a bit over, but very close.
So in the end, the name of this thread should be something like "more realistic contrast for realism junkies!", but I didn't think about it at the time, and the only thing I could relate it to was HDR, since I'd been working with it at the time. (And they did look pretty close at first.)
In the end I think I learned a bit more about my hardware, like the fact that my monitor isn't as messed up as I thought.
And I discovered a really cool photo trick to get fake HDR that doesn't quite live up to the name, but that's what they called it at school.
so far I have not seen well-done HDR realtime rendering...it just looks so fake...Half-Life 2: Lost Coast included, it just isn't there yet...it certainly doesn't make things seem more lifelike or real...now Crysis (a DX10 game coming soon): the HDR in that game looks much more realistic! but of course that game uses other DX10 effects to achieve more realism and has the benefit of using an entirely different lighting system more akin to 3D modelling, but in realtime...
Of course the eyes have a higher contrast ratio, and LOL at the bulb experiment!
Games have done that from day one.
Blooming is meant to emulate a really bright day where you're semi-squinting at everything, and it really adds to games as it gives you similar difficulty in targeting people in high-contrast scenes.
I'll disagree on this; the right-hand side of the picture I presented looks more natural to me.
The math behind HDR WILL flatten the contrast range of the source photos you pump in. That's the ENTIRE point of HDR. It compresses a massive range, taken over multiple exposure levels, into what a monitor can display. So the bright areas from the most exposed pictures will be given the colour value of 255, and the darkest bits of the least exposed source images will be given the colour value of 0. That's why it looks flat. It does look unnatural and dim. That's perfect HDR, but people don't do that when they go for a "good" HDR pic. They tone it down and set an upper and lower threshold where everything above and below goes to 255 and 0 respectively. The wider the range of HDR you use, the flatter the image will look.
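To make that concrete, here's a toy sketch of the merge-and-flatten process described above (pure numpy, not any real HDR tool; the images, exposure times, and function names are all made up for illustration):

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Naively average multiple LDR exposures into one linear radiance map.

    Each 8-bit image is divided by its exposure time so that all frames
    estimate the same underlying scene radiance, then averaged.
    """
    estimates = [img.astype(np.float64) / t for img, t in zip(images, exposure_times)]
    return np.mean(estimates, axis=0)

def tone_map_linear(radiance):
    """Linearly compress the full radiance range into 0-255.

    The brightest value in the scene maps to 255 and the darkest to 0,
    which is exactly why a 'perfect' HDR result looks flat and dim.
    """
    lo, hi = radiance.min(), radiance.max()
    scaled = (radiance - lo) / (hi - lo)
    return (scaled * 255).astype(np.uint8)

# Two fake exposures of the same scene: a short one that keeps the
# bright areas and a long one that lifts the shadows.
short = np.array([[10, 200], [0, 255]], dtype=np.uint8)   # 1/100 s
long_ = np.array([[80, 255], [5, 255]], dtype=np.uint8)   # 1/10 s

hdr = merge_exposures([short, long_], [0.01, 0.1])
ldr = tone_map_linear(hdr)
```

Real tools use weighted merges and response-curve recovery rather than a plain average, but the flattening at the end is the same idea.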
By definition, non-HDR games are perfect HDR scenes, as they are all restricted to a 0-255 range (depending on colour bit depth).
And HDR in games, like Lost Coast, look far better to me than the same engine with HDR off.
And the eye DOES have exposure restrictions. Have you never tried to look at a plane flying near the sun? You can't see it until you put your hand in the way to just block out the sun; then you see the plane flying right near it. Also, bright lamps are used for interrogation so you can't accurately see the person behind them. Eyes are STILL affected by this, not as much as a cheap digicam of course, but the effect is still there!
it was just to prove to myself that its not all the iris and actually my retina which has the large dyn range
no they havent they present you a flat picture on a (contrast wise) flat monitor which is unable to cause any noteworthy alteration of your iris while playing
it might give you similar difficulties but its definitely not what i see on a bright day and thats my point ... it looks nothing like i perceive the world around me
i can agree on the lower half of the picture but the upper half looks rubbish
i _strongly_ disagree on that one
no thats whats done to display a hdr image with current technology its not what hdr photography is about
what youre doing there is you take a hdr photo and turn it into a ldr photo for your monitor ... thats hdr photography but the result is _not_ hdr
in true hdr rendering and hdr picture formats pics arent saved or calculated in a 8-8-8 bit rgb space but in a colour space with a higher available contrast
thats the reason why nvidias 7 series of cards cant do aa and hdr at the same time ... they cant calculate aa in a float colourspace
what youre talking about is compression to get around over or underexposure on photos which is the same technique as in hdr photography but yields vastly different results
that makes it even worse and is what this whole thread is about actually but like i said thats not hdr
again my point if that pic was hdr it wouldnt be saved as a jpeg and your as well as my monitor would have a really hard time displaying it
actually they arent like i said earlier ... its the whole point of hdr rendering ... working in a non 32 bit colourspace
so if they would actually use the hdr capabilities the way theyre intended to be used they would have a very high contrast picture in graphics memory (not like they do now an over or underexposed low contrast one) which then would naturally have to be flattened to display it on your monitor ... of course this would look just like ldr rendering looked ever since quake 1 hit the shelves
hdr and (current) lcd and crt technology just dont go along with each other
personal preference and like i said i have to disagree
being an rc pilot ive actually done this a lot and was still able to control the plane without any problems as i could still see it
yeah but thats because the room behind the lamp is pitch black and while i havent tried this myself i would hazard a guess that the interrogated person is still able to make out the shapes of his interrogators which would mean that the eyes have a far greater dynamic range than 32 bit rgb at any constant iris setting
I'm comparing images on a monitor, to images on a monitor.
I KNOW THE EYE HAS A HIGHER CONTRAST RATIO THAN A CAMERA.
I KNOW A MONITOR IS UNABLE TO DISPLAY CONTRAST OF WHAT IS VISIBLE IN THE WORLD.
BLOOM ISN'T HDR.
What I'm saying is: when you display a HDR photograph ON A MONITOR, and when you display a HDR game ON A MONITOR, they are polar opposites.
The fact that the images I displayed are JPGs makes no difference, as when they were in RAW format in the program used they looked exactly the same, due to the restrictions of a monitor.
Bloom does a good job of representing a really bright climate when you're not wearing shades. That's why shades were invented: so you don't walk around squinting.
Seeing "basic shapes" behind a lamp in an interrogation room proves my point about the eye not being like HDR. In HDR you'd have a very long exposure photograph to pick up the things behind the lamp in perfect colour and contrast NOT relative to the lamp. This image, showing very clearly the things behind it, would then be merged with a very underexposed photo of the lamp, to form a HDR picture capturing more detail than the eye could.
But that's beside the point I've been trying to make. I'm taking viewing images/games on a monitor as a given. I thought that was obvious.
And using your RC plane example, let's say it flies right at the sun. Yes, you can still see it, or see the shape/silhouette of it, but you won't be able to read any writing or see detail on the side of it as easily as you could if it were against a more neutral background.
I'm not on about how easy it is to locate or "see" it; I'm talking about how well it contrasts against different backgrounds.
If you were to freeze the plane where it was, you could make a HDR photograph that shows the sun, sky and plane in full detail and relative colour on a monitor. This is how a non-HDR game would render the scene on a monitor. Right?
Now, if you took the picture on a normal camera and didn't perform any HDR processing (so a non-HDR photo), the plane would be silhouetted against the sun, and would look how a HDR game would render it. See?
Thus making my original point: HDR in games and HDR in photography have opposite goals when put onto the same medium (monitor/print). I'm not talking about how the eye perceives things, because NEITHER HDR, when rendered/printed, is how the eye perceives them.
as you hopefully can tell from my posts im fully aware of that but can we agree that the goal of any hdr _should_ be to make the game look exactly like you perceive the real world ?
i didnt doubt that youre aware of those facts i simply said you draw the wrong conclusions from these
i know and i was aware that you were saying that
the point is that while they may be hdr in memory they arent on your monitor (and thus in the real world) so in effect they are in fact ldr images and doubly so if you save them as jpeg
my personal take on that is that if i squint my eyes i dont get any bloom or at least not nearly as much as i get with games ... what i get is a sort of ray effect emanating from the bright spots (which are very slightly blooming) and some random blurry bright shapes which are actually my eyelashes
yes but can we agree that the correct way to display said photo would be with a monitor that achieves at least the same brightness as the lamp in the pic thus limiting the viewers perception of the low brightness areas in the picture to basic shapes ?
it is but my entire point is that current technology renders both hdr photos and hdr rendering bullshit and that lcds have made matters even worse than they were to begin with
yes
not quite as the entire plane would probably be outbloomed
but still neither represents what you would see with your eyes
ideally your monitor would achieve the same brightness as the sun
now obviously this is either not achievable or would never pass safety restrictions
so the only correct way to display that scene would be with a monitor that can at least achieve the dynamic range of your retina while forcing the iris not to change shape (or by constantly measuring its momentary shape and adjusting the brightness according to that) during the entire run of the game and then applying the current way hdr games work in a much much more subtle fashion
yes but to sum up what i said earlier hdr photography and true hdr rendering (ie not what you get in games) is the basis for displaying scenes in a way for your own eyes and brain to apply perception themselves
what hdr games currently do (ie calibration to brightness at screen centre) should only ever be used to keep monitor brightness in a safe envelope and not to make games look like they are filmed with a cheap camera with sluggish brightness calibration
GAAAAAAAAAAAAARGH!!!!!!! I don't care about EYES!!!!!!!!!!!!!!!!!!!!!
I don't care about monitor capabilities!!!
I'm comparing HDR photography versus "HDR" effects in game on the same medium, in this case a monitor!
WHY is this so hard for you to understand?
You could take a screenshot of a HDR scene and a non-HDR scene in a game. Then you could take a photograph using HDR, then one not using HDR. Then PRINT THEM OUT, onto PAPER, both using the same printer, and the non-HDR game picture would look most similar to the HDR photo, and the HDR game picture would look most like the non-HDR photo. That's all I'm trying to say. But you keep refusing to acknowledge it and somehow try to prove me wrong.
HDR isn't a set range; it can be any range you specify. Have you ever made a HDR photograph?
The HDR technique could let you combine, in one picture, a scene containing a metal-cutting laser shining head-on into the lens, and near-black writing on a black surface in the background, and both would be viewable, and the picture would have an even contrast and look flat.
This is where you're going wrong. This is NOT the goal of HDR. You can take a HDR photograph that is completely unrealistic and puts even the best eye on this planet to shame by a long way. HDR isn't just a larger value for storing colour. It's not like 64-bit colour; it's a mathematical process to average the contrast from multiple source images shot at different exposure levels. You can have an 8-bit HDR photograph.
Before we continue: Have you ever created a HDR photograph?
But all this is irrelevant as I'm going to tear out my eyes in a minute
so do i but with a completely different take on them
i do too but unlike you its not acceptance which i arrive at
i fully understand your viewpoint and ive made that clear several times
ive acknowledged that years ago and ive made very clear that i do
wiki:
HDR images require a higher number of bits per color channel than traditional images, both because of the linear encoding and because they need to represent values from 10^−4 to 10^8 (the range of visible luminance values) or more.
i havent but im well aware of the process involved
why is it so hard for you to accept that using the techniques involved in making a hdr photograph doesnt automatically make the outcome hdr ?
taken from wiki:
The intention of HDRI is to accurately represent the wide range of intensity levels found in real scenes ranging from direct sunlight to the deepest shadows.
the operative words being "accurately represent"
that would not accurately represent the dynamic range of the real world
what youre talking about is compression and curve fiddling not hdri
again from wiki:
The values stored for HDR images are often linear, which means that they represent relative or absolute values of radiance or luminance (gamma 1.0).
what youre describing is not gamma 1.0
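The gamma 1.0 point from the wiki quote can be illustrated with a small sketch: ordinary 8-bit images store gamma-encoded values, while HDR formats store linear light directly. This uses a simple power-law curve as an assumption (real sRGB uses a piecewise function, but the idea is the same):

```python
def linear_to_gamma(v, gamma=2.2):
    """Gamma-encode a linear [0, 1] intensity, as ordinary 8-bit images store it."""
    return v ** (1.0 / gamma)

def gamma_to_linear(v, gamma=2.2):
    """Decode back to linear light (gamma 1.0), which HDR formats store directly."""
    return v ** gamma

# Gamma encoding spends more of the 0-255 code range on dark values;
# linear (gamma 1.0) data lacks that trick, which is one reason it
# needs more bits per channel.
mid_grey_linear = 0.5
mid_grey_encoded = linear_to_gamma(mid_grey_linear)  # ~0.73
```

So a value that is mid-grey in linear light lands well above the middle of the encoded range, which is exactly the kind of curve fiddling that a gamma 1.0 HDR store avoids.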
doesnt disprove my point ... like i said earlier ideally a hdr image system would be able to capture _and_ display the full dynamic range from pitch black to direct sunlight
actually at its core thats the whole idea behind hdr
no thats a technique developed later on to create hdr images with ldr devices its _not_ the original idea behind hdr
no you cant because it would
1) not be hdr
2) be tone mapped
and probably
3) not gamma 1.0
I have a different opinion from you on what a HDR image is, then.
To me a HDR image is something that can display a higher range of contrast on screen or print, by combining multiple images of the same scene taken at different exposure levels, then normalising them to work within the confines of the media they are to be shown on.
You do realise that you don't need special ultra-bright photo film/paper and ultra-dark chemicals/ink to develop/print a real-life physical HDR photograph that you can hold in your hand, right? No matter if it's the result of merged exposure pictures or one from a high-quality camera, you still use the same colours and intensities when it's printed/developed/viewed!
What you're talking about is academic: it's how HDR data is stored, and file format. But if that's what a HDR image was, how would anyone ever see it? What use is a higher-contrast file format if it can't be printed or viewed? The answer is, the contrast gets normalised to suit whatever medium it's destined for, which is why the photograph of my decking is a HIGH DYNAMIC RANGE image, normalised for JPG, for screen/print. This is the part I care about. I can show you the original photo of the decking and I dare you to say they are even remotely similar in contrast.
And if you still don't understand, I'll happily show you some examples later of some very low quality HDR images, versus the originals.
well fine i can accept that but i believe that so far all definitions of hdr ive read relate to my take on hdr
to me as well ... the thing is though youre mixing up source and output dynamic range which created the whole mess of a discussion we had
therein lies the problem ... if the output media confines you to a low dyn range the output will be ldr regardless of the dyn range of your input
yes and no ... afaik print can reach a contrast of about 1:4000 which is to be considered ldr
accurate representation of the worlds dyn range relates to representation in memory as well as real world output
via hdr displays which are slowly reaching maturity
heres one thats pretty amazing: http://www.bit-tech.net/hardwa ... brightside_hdr_edr/1.html
according to this graph (yeah i know it looks suspiciously like marketing) on page 3 http://www.bit-tech.net/conten ... ide_hdr_edr/luminance.jpg
the display is able to display the retinas full dynamic range which means with tone mapping (like in current hdr games) applied to the dynamic output range of this monitor you will be able to achieve the full effect of hdr for human eyes if you are somehow able to ensure that the iris is constantly calibrated to the range from 0 to 5000 cd/m^2
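One common way to do the tone mapping mentioned above is a global operator such as Reinhard's L/(1+L), which squeezes an unbounded luminance range into a display's range. The thread doesn't name a specific operator, so this is just one illustrative choice, with invented scene luminances and a display peak taken from the 5000 cd/m^2 figure above:

```python
import numpy as np

def reinhard_to_display(luminance, display_max=5000.0):
    """Map scene luminance in [0, inf) onto a display's [0, display_max) cd/m^2
    range using Reinhard's global operator L / (1 + L)."""
    return display_max * luminance / (1.0 + luminance)

# Scene luminances spanning several orders of magnitude,
# from deep shadow to a very bright highlight.
L = np.array([0.01, 1.0, 100.0, 10000.0])
nits = reinhard_to_display(L)
```

The curve never clips: every input gets a distinct output below the display's peak, which is the whole appeal over the hard 255/0 thresholding discussed earlier in the thread.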
the source image is hdr the output you get from it isnt
not in input contrast captured but the output contrast will be the same
If film has a high dynamic range, then technically any picture taken with any LDR camera is HDR when printed. Not in input, but in output. Agreed?
By the same token, you could take a picture with a HDR device, or create one in HDR software, and print it on non-HDR media. This would then be a LDR picture: not in input, but in output, yes?
The "LDR" picture would contain far more detail and far less clipping at both extremes, so I think the initial input is more important than the output.
I think I get what you're trying to say, but this is a computer game forum, and it really is academic when you talk in the context of computer games, which are displayed on regular computer monitors. The difference is, you can easily tell a HDRI picture from a normal one on standard non-HDR hardware.
And pictures taken with HDR, when displayed on LDR hardware, look MUCH better than pictures taken with LDR shown on the same LDR hardware. So in the context of computer gaming and computer imagery it's a very good thing and not to be sniffed at, I think.
And all I wanted to do was highlight the complete difference between HDRI and HDRR when displayed on a monitor, as some people seemed to get the wrong end of the stick.
film as in film reel ? if so then yes (dunno the exact dynamic range of film reels but i suppose its rather large)
absolutely
again absolutely
well of course ... basically you have a higher amount of samples at the input which enables you to achieve a better output or at least opens up more possibilities for designing your output
yes and no
the way all of this relates to games is that imho hdr as its currently done in games is utterly rubbish and such an application of hdr should only ever be used if youre trying to achieve a certain atmosphere in the game (like overexposed desert scenes to give the gamer/viewer the impression of heat)
or the other way around ... lfs looks perfectly fine and is a much better representation of what you see than any overexposed hdr game
likewise games that achieve the effect of a hdr input (ironically all those that dont use hdrr) look better on ldr screens
The trouble with HDR effects in a lot of games is they are trying to recreate the effects seen by a camera, rather than the effects seen by your eyes (things like lens flares - I know I've never seen a lens flare.).
Certain effects are good however. Proper motion blur will make a lot of games look better because your eyes don't see a series of still images, they see the whole thing moving. Same thing with bloom. You do see a small amount of bloom in the real world, just nothing like as pronounced as a lot of games these days. Half Life 2 Episode 1 had it pretty much spot-on.
I have to agree about HL2 E1
All my card supports is bloom, and in CSS it's way overdone; de_dust has to be the hardest on your eyes with bloom on. (I don't play it much though; never really liked Counter-Strike.)
But in Episode 1 it was amazing.
the problem with any blur effect (mostly motion and depth of field) is that you cant know which object the gamer is following with his eyes or which one he wants to focus on
again the same as for hdrr in its current form applies
for dramatic effect ... great
for everything else ... nah
as per usual the only real solution would be something that truly replicates real life ie a 3d display that can fire some 500+ fps at you
I wouldn't say 500+ fps, since the human eye really can't tell the difference after 60+, or somewhere around that.
These people that go nuts over 200fps frame rates are just wasting their time, since your monitor can't display that anyway, unless, that is, you use an insane refresh rate.
We are a long way from getting a true 3D display, but we can somewhat fake it with special hardware.
I've never used them, so I don't know how well they work, but I've read a lot of positive reviews about them.
I might be purchasing a pair sometime when I have the money, after I get a license for S2 that is.
Edit: to get the glasses to work you do have to set your refresh rate higher than the standard 60Hz.
first of all yes it can see the difference and for your eyes to blur the motion themselves the framerate would have to be a lot higher than 60 ... youd need frames that are actually fluid not just ones you perceive as such
also i meant displays that actually can fire those 500 frames at you not just that the graphics card is able to calc them
actually with 2 projectors and polarization filters you can get the same 3d effect as in an imax ... and thats pretty much as real as it gets
Well, technically it isn't 3D, because the objects still aren't actually rendered in true 3D; it's more of a perspective trick.
What they do is shut one eye as they display one perspective, and then shut the other as they show the second perspective. This just keeps swapping back and forth at the same rate as your monitor refreshes.
Unless you have a refresh rate above 70Hz it doesn't work well.
@Shotglass
If the FPS was higher it still won't blur on its own, especially at rates that exceed the perceivable difference.
The screen is still stationary, so you won't get blur.
Motion blur has to be done by the rendering system to get it to work well
Look at the motion blur mod; it's very buggy but I've tried it. It actually blurs what's going by but didn't blur the hell out of the track ahead.
I had to play with the configuration file for a bit to get it to look good, but it really does a good job.
Only problem is it crashes sometimes so as stated on the site, not recommended for regular use.
While you take a look around in your room each of your eyes senses flat images that are interpolated in the occipital lobe of your brain. Those images show the same thing from a slightly different angle.
correct, the brain "sees" the different images coming from each eye and gives you perspective through the magic of cranial logistics (term I just thought of to mean how the brain works :shrug:)
yes you will just like you do with moving objects in the real world ... it doesnt matter if the object is physically moving or not (besides its physically moving in the same way objects in the real world move ... their "light signature" which your eyes pick up and your brain interprets changes its position ... doesnt really matter if it does that irl or on a monitor)
and the fps has to be way above the perceivable difference otherwise the frame wont change at all during one "frame" of your brain and thus nothing will be blurred
or other way round if the image on screen changes rapidly in the time it takes the brain to analyze your eyes output things will be blurred
nope ... actually if its done in the rendering pipe it will always look fake and tiled
thats really only useful for movies (for which it isnt actually useful due to the way fraps captures but thats a whole nother story)