This should not be done at all; no hardcoding, please. That's erroneous behaviour and can lead to problems when dealing with footage, for example.
At least give a checkbox to enable/disable the "pretty but incorrect" behaviour, and leave it enabled by default, but the sun/sky should be able to behave the way it's supposed to.
In all render engines you have to adjust exposure when you work with a physical sun/sky; that's not something we need to avoid. But as I said, at least give a checkbox to disable that behaviour.
No, please, adding extra mess to the UI is not going to help anyone. If people want to control it, they just have to grab the mouse and slide the Strength of the Background node; that's it.
I agree with you: nothing hardcoded should ever be even remotely considered in 2020.
The first minute covers the photography fundamentals, and then he shows all the exposure tools that UE has. Very interesting.
If only there were already an Auto Exposure feature implemented in Blender, I'm sure this whole discussion wouldn't exist.
Guess what, I made an add-on for that too (https://www.youtube.com/watch?v=lhY9E0bHwFE), and it seems the creator of Photographer based their code on my add-on.
But I'm tired of all these "physical" add-ons; it's 2020 and we shouldn't have to rely on add-ons for such basic stuff.
Maybe that is what needs to be done. And it can be done totally transparently if the Sky texture divides by 100, knowing that it will be multiplied by 100 by the Background shader.
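A minimal sketch of what that transparent compensation could look like at the node level, assuming a hypothetical /100 inside the Sky texture (the node and socket names are from Blender's Python API):

```python
import bpy

# Wire a Sky texture into the world's Background shader.
world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes

sky = nodes.new("ShaderNodeTexSky")
background = nodes["Background"]
world.node_tree.links.new(sky.outputs["Color"], background.inputs["Color"])

# If the Sky texture internally divided its output by 100, a default
# Strength of 100 here would cancel it out: (sky / 100) * 100 == sky.
background.inputs["Strength"].default_value = 100.0
```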
I don’t think that’s elegant, but maybe there is a multiplier like that in UE4 somewhere (either hardcoded or just a default value somewhere). I’m not immediately sure how else to explain the result from my UE4 test.
Sure, it can be a parameter, that’s what I suggested above.
Did you actually check this, and in which renderers? And did you check how much you need to adjust, making things 100x darker or a smaller amount?
Auto exposure doesn't matter; it's about the relative intensities between the Sky texture, emission and HDRIs.
Anyway, my point was just that not having 1:1 pixel mapping out of the box is not an issue, so it won't interfere with something like motion graphics. Also, I was mainly talking about having some sort of physical camera support in Blender, rather than the sky exposure settings proposal: just having a camera object where I can set up ISO, f-stop and shutter, affecting each other in one place.
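For reference, those three settings combine into a single exposure value through a standard photographic relationship; a quick sketch (the function name is just illustrative):

```python
import math

# EV100 = log2(N^2 / t) - log2(S / 100), where N is the f-number,
# t the shutter time in seconds and S the ISO sensitivity.
def exposure_value(f_stop, shutter_seconds, iso):
    return math.log2(f_stop**2 / shutter_seconds) - math.log2(iso / 100.0)

# "Sunny 16" check: f/16 at 1/100 s and ISO 100 lands near EV 15,
# the classic setting for a subject in direct sunlight.
print(exposure_value(16.0, 1 / 100, 100))  # ~14.6
```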
I tested with auto and manual exposure, and with physical units; it doesn't make a difference to the point I'm making. My concern is about the relative intensity between the Sky texture, emission and HDRIs.
And the point is, the Sky texture should not be 100x brighter than the others. Maybe the solution is to make everything else 100x brighter to match (hidden from the user, or as a parameter). But changing just one of them is not going to work well, nor do I believe there is some standard convention among renderers that they work like this.
I’m all for adding a physical camera. But again, it must be done completely and not halfway for one mode of one texture.
I see your point about choosing everything arbitrarily, but in an ideal 3D program physics is what it is, and I can't accept "let's hack the value internally to look good with everything else" as a choice.
I think making everything 100x brighter and adjusting the exposure value accordingly could help.
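For what it's worth, the compensation needed to cancel a uniform 100x brightness increase is a fixed number of stops:

```python
import math

# Blender's Color Management exposure works in stops, so a uniform 100x
# brightness increase is cancelled by lowering exposure by log2(100) stops.
compensation = -math.log2(100.0)  # ~ -6.64 stops
```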
Another thing that could be done is something like camera presets (much like on a real camera) for specific lighting conditions. If there is a preset that’s called “sunny day” and one called “indoors” or something (the details and granularity could be debated), that might be pretty discoverable. It could also give a point of comparison and starting point for people to learn all about the various parameters that go into setting up a camera.
And people could easily add their own if they wish. That could be nice for, say, productions that switch between locations requiring wildly different settings, so each location could have its own preset attached.
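A rough sketch of how such presets might be stored and applied; the names, EV values and the neutral-EV mapping are all illustrative, not from any actual design (the EV numbers come from common photographic exposure guides):

```python
import bpy

# Hypothetical presets: approximate EV100 values for typical lighting.
CAMERA_PRESETS = {
    "Sunny Day": 15.0,      # direct sunlight ("Sunny 16" territory)
    "Overcast": 12.0,
    "Indoors": 7.0,         # typical domestic interior
    "Night Street": 3.0,
}

def apply_preset(scene, name):
    # Map the preset EV onto Blender's Color Management exposure slider,
    # relative to an arbitrarily chosen neutral EV (assumption: 9.0).
    neutral_ev = 9.0
    scene.view_settings.exposure = neutral_ev - CAMERA_PRESETS[name]

apply_preset(bpy.context.scene, "Sunny Day")  # exposure = -6.0 stops
```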
Another possibility, which would probably be somewhat more technically involved, but a lesser GUI change, is to add some sort of auto exposure feature that could be on by default. For most people in most situations that’s probably desirable, at least while messing around. And if you want to mess with the values, you can always uncheck that box.
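Under the hood, such a feature would likely be some variant of classic average log-luminance metering; a minimal sketch, not any actual Blender implementation:

```python
import math

def auto_exposure_stops(pixels, target_grey=0.18, eps=1e-6):
    """pixels: scene-linear (r, g, b) tuples; returns exposure in stops."""
    log_sum = 0.0
    for r, g, b in pixels:
        # Rec.709 luma weights reduce each pixel to a luminance.
        y = 0.2126 * r + 0.7152 * g + 0.0722 * b
        log_sum += math.log2(max(y, eps))
    # The geometric mean is robust against a few very bright pixels.
    avg = 2.0 ** (log_sum / len(pixels))
    # Stops needed to map the average onto photographic middle grey.
    return math.log2(target_grey / avg)
```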
Either way, I think the reason to be as physically based as possible is to accommodate all cases where data actually is available to inform your workflow. It’s not too hard to fiddle with it all and go wild from there (say, via nodes). But the defaults and basics ought to make physical sense. That’s just gonna be the most natural workflow.
Well, I worked over the years with mental ray, iRay, Corona Render (and some others, but these are the main ones before we went full-blender) and in all of them if you didn’t do an exposure adjustment the image was totally blown out.
I think that's perfectly normal; if you want to deal with realistic values you should expect this. If you didn't have it, how would you do a daylight scene that transitions into a late evening where you then turn on the interior lights? It would be a mess of light power values.
The same goes for an outside/inside shot, or the other way around; working with exposure is expected.
The "weird" thing in Blender is that exposure is controlled under Color Management instead of being a per-camera setting, so people are not totally familiar with it. The same goes for the exposure value: there are no native "physical camera" settings for aperture, shutter speed and ISO, which are pretty standard in other engines.
But yes, getting a blown-out image is kind of normal, and you should learn to deal with exposure; it's a minimum for anyone wanting to do some rendering.
Regarding HDRIs, it all depends on whether the HDRI is correctly exposed and prepared. Many of them are not, and their light is pretty poor; getting one that is actually correct is not easy, I'm afraid. I know the latest ones from Adan Martin: they are not cheap, but they are correctly made and prepared, and if you use them at 0 EV you get a blown-out image.
Personally, I think the compromise of having the default solar altitude be near-dusk is actually a pretty good solution. Noonday sun on a cloudless day is pretty ugly lighting; ideally you don't shoot under that kind of lighting unless it's needed by the shot for a particular reason (the story says the scene takes place at midday, wanting to give the feel of hot weather, etc.).
Even if the "artistic" brightness is chosen, the default elevation should not be 90°; that only occurs in the tropics, and even then only for a few minutes per day during part of the year. The formula for the maximum solar altitude at your location is 90° - latitude + 23.5°. For large parts of the globe, the sun doesn't even climb above 45° for much of the year. For example, in Amsterdam, where the Blender Foundation is located, the sun hits a maximum altitude of ~60° today, and it's mid-July! On Christmas Day it maxes out at only 15°! My experience has been that a lot of artists without experience in outdoor photography or astronomy greatly overestimate how high a typical sun elevation is. Even if the shader is set to midday by default, that should probably be something like 35°, not 90°.
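A quick sketch of that formula at the solstices, using the Amsterdam numbers from above (23.5° is the Earth's axial tilt):

```python
def noon_altitude_range(latitude_deg):
    # Noon solar altitude at the summer and winter solstices.
    alt_max = 90.0 - latitude_deg + 23.5
    alt_min = 90.0 - latitude_deg - 23.5
    return alt_min, alt_max

# Amsterdam, ~52.4 N: about 14 degrees at Christmas, 61 degrees in June.
print(noon_altitude_range(52.4))  # (14.1..., 61.1...)
```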
Also, when using the sun for light streaming in the windows in interior scenes, the physical intensity option actually does give the expected default results. In that case, you want sun streaming in a bright window as a few highlights. It’s only when the subject is outside under direct sunlight that you need to fix exposure in the first place.
Finally, the higher values may well be desirable when outputting for HDR10/Dolby Vision, where the brighter look under the sun may be wanted to better convey the impact of broad daylight. (If they want to grade down so a grey card under direct sunlight lands around 100 nits, that's fine, but that's a grading decision and not one the shader should make by default.)
There is no such thing as “correct” with regard to HDRI ratios; the meaning is derived from the ratios between values, and there is no convention to use absolute units.
As a result, the values need to be scaled to match the ground truth of a given shot, and that too, varies.
I prefer physically accurate, but I’d love tools like this to have presets to make things look good in different use cases. And even to be able to simulate other stars and planets in the future. Making it simple for the user is never wrong.
Good point, curious to know if there are any plans for that implementation already.
Meanwhile, in your opinion, what should we opt for: arbitrarily down-exposed, or a sky with no hacks (which appears bright at Blender's default exposure value)?
Just a heads-up: having physically based sky lighting would be beneficial for a lot of things, consistency in particular, as you can see in this video:
And by the way, the poll results show that people prefer physically correct over good-looking defaults: https://www.strawpoll.me/20587754
On top of that, many suggested going with the Sun Elevation 0 compromise.
Thank you for pointing this out, @troy_s. There are only a few ways to have an absolutely correct HDRI, and they all pretty much require adjusting in-engine compared to reference photos along with plates taken alongside the HDRI, colorchecker charts, and/or using photometric values obtained at the time of shooting the HDRI.
Getting photometric values out of Blender for double checking is difficult. The kinda-hacky way is to take the scene-linear RGB values, get the luminance from the RGB values, multiply that by 683, and then multiply the whole shebang by pi to get lux - but as @troy_s mentions above with the ISO stuff, it’s all a bit hocus-pocus magic.
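That kinda-hacky conversion, spelled out (assuming the render is in Rec.709 primaries and the scene-linear values behave as W/(sr·m²)):

```python
import math

def scene_linear_to_lux(r, g, b):
    # Luminance from scene-linear RGB via Rec.709 luma weights.
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    nits = y * 683.0       # cd/m^2, using the 683 lm/W peak luminous efficacy
    return nits * math.pi  # lux off a perfectly diffuse (Lambertian) surface
```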
Regarding UE4’s physical sky implementation - by default it is not absolutely correct, but scales to be correct with whatever intensity the sun is set at. It has nothing to do with auto-exposure. If you place the new UE4 sky in a scene with a sun with below-physical values, the sky will match that relatively dim sun, and likewise if you set the sun at a realistic value, the sky will scale up appropriately.
By default, UE4 exposure range doesn’t even go high enough to cover a physical sky - that has to be enabled under “extend luminance range” under project settings. My point isn’t to necessarily take sides with this info, I’m just establishing the context that, no, UE4 is not physically absolutely correct by default.
To just take this one step further for the benefit of the community and to disambiguate what it takes to have an accurate HDRI, here is how you would do it using UE4 (which has photometric-friendly tools available).
First off, you have to shoot the HDRI yourself. In addition to whatever method you're using to capture the HDRI, you need a plate with a Macbeth chart that you also have a digital synthetic copy of. On top of that, it wouldn't hurt to have a middle-grey ball (the expensive kind with an actual known value of "middle grey" that can reasonably be verified against an RGB equivalent).
After you shoot the HDRI, you need to do an incident light check with a light meter, giving you the lux falling on a surface you can recreate in UE4, whether that's a diffuse middle-grey card, a piece of white paper, or whatever. You then also need to do some spot checks of the sky, giving you luminance (cd/m², or nits if you like) over a few sample points; let's say a dozen spot checks of the sky at various points.
Create and white balance the HDRI based on whatever methodology you want.
In UE4, you apply the HDRI and have a digital recreation of whatever object you did an incident check against. Up the HDRI intensity until you get the same lux coming off that bad boy. At the same time, you can also take a lower mip of the HDRI (to get a blurred, and thus averaged, result) and start doing luminance checks of the sky to see how they match up with your real-life luminance spot checks. If you can get the lux and cd/m² to match your reference, you're finally good to go.
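Since incident lux scales linearly with light intensity, that matching step is really just one division once you have a first reading; a trivial sketch (the names are illustrative):

```python
def corrected_intensity(current_intensity, measured_lux, metered_lux):
    # Light transport is linear, so scaling the HDRI intensity scales the
    # incident lux by the same factor; one measurement solves the match.
    return current_intensity * (metered_lux / measured_lux)
```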
It’s basically impossible to just take any old EXR that somebody claims is “correct” and trust it.