New Sky Texture

Good then,
Ton said that if I don’t agree with the Cycles module members as a whole, then I have to reach out to him, which is exactly what I’m going to do if @sergey doesn’t answer within a day or so.

Hi. I read here that decisions are being made according to the votes and opinions of users. Just in case, I feel obliged to point out that there may be users like me who don’t have the slightest idea of what is being discussed here, or of the technical issues involved. So perhaps some, like me, simply abstain from casting a vote or giving an opinion out of ignorance of the subject under discussion.
It should also be noted that this forum has many developers and users with a lot of technical knowledge; it is not a forum for artists or ordinary users.
I’m just saying this so that the likely bias in the result of those votes can be taken into account.

On the other hand, I wonder: is it really so difficult to implement a preset system, or something like it, that satisfies both sides?

4 Likes

Apparently it is.
I liked the Sun Elevation 0 solution, but the Cycles devs don’t seem to be listening or willing to accept it.
Having read a lot of interesting scientific papers and having implemented this new Sky Texture (thanks to the help of some really awesome devs), I can’t accept the sky values being corrupted by someone else at all.
The sky is blue after all.

5 Likes

Mmm, I regret that this is so.

As I said, I don’t quite understand what the discussion is about. So if some say 90º and others 0º, I would choose 45º. (Don’t take me seriously; I’m only noting that I would choose whatever option best reconciles both sides.)

You are in disagreement with @brecht and are trying to get your way by dragging other Cycles team members into this argument, clearly against their will.

You sure that’s the way to go here?

16 Likes

This is simply exceptionally challenging to implement in a properly colour-managed system. Bear in mind that Blender is actually ahead of some tools in this regard, while still struggling.

When a piece of software doesn’t manage things, and makes horrible assumptions about BT.709 primaries etc., it’s trivial. It’s also broken.

Bear in mind that the other software in question is also so horribly managed as to make the result meaningless. As in, so mismanaged as to be garbage.

2 Likes

But a lot of Blender users need them to affect the exposure, and/or an EV slider, per camera. And I think we should already agree on this. Is it really that hard to do that you have to come up with an essay about a future of photography where this doesn’t make sense, just to make our arguments irrelevant? Again, let’s first catch up with today’s workflows before going revolutionary. Or is your point that Blender doesn’t have to change because it already has the best approach to setting up illumination and camera effects?

I see setting Blender to have auto-exposure on by default, or relying on compositing or global sliders for tuning the exposure, as worse solutions than having a camera that controls every aspect of the shot, for every shot, at least from a UX perspective. It is far worse for casual Blender users once they get a render that is not what they expect because of arbitrary hidden defaults, hardcoded values and missing features, and they don’t have a clue how to change it without going to forums and such. I think we can make Blender more user friendly, for real.

1 Like

The thing is, for the average user, they just work. And we are targeting the average user in our arguments, aren’t we? I want Blender to be ahead, but not to be unmanageable or less capable of delivering than other software. I think we have people here capable of taking on challenges, or of coding checkboxes for options that can introduce inaccuracies while making life easier for all of us.

1 Like

They don’t “Just Work” any more than anything else people get confused about, alpha included. “Just Work” typically means “wow, I don’t have a clue about this” for a certain subset of the audience, while another subset educates itself.

That “Average User” is a fictional entity, and even if they did exist, they’d be as valuable as a token from a casino.

In terms of actually doing shit that matters, instead of waxing lyrical on forums, folks need to understand things like:

  1. Basic file encoding is broken.
  2. Basic alpha handling is broken.
  3. Etc.

There’s lower-hanging fruit than magically deciding to try and change all of the scales to some other magical system that doesn’t Just Work any more than the garbage broken mess prevalent in plenty of other software.

I’ll stand with experienced craftspeople who can look past a UI issue and cut to the chase of having a consistent series of responses across the software.

3 Likes

I know very well this is going nowhere. That’s why I want to remind everyone that, maybe, setting the default sun position to 0 in a non-hardcoded Nishita Sky would close this very discussion. I personally would ask for an exposure value that means something, because if the following is accurate:

I don’t know why we can’t introduce the EV in the exposure control instead of tuning around a meaningless default exposure of 0.

2 Likes

I usually do that as well, but not today.

What would be a meaningful default exposure?

I think it’s about exposing EV rather than hiding it behind a cryptic ‘0’ exposure. Some sensible default exposures might be exposing direct day sunlight correctly, or daylight shade, or any number of other values. Exposing the controls seems just as critical as choosing a sensible default, though.
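For a rough sense of scale (nothing Blender-specific, just the textbook rule of thumb): by sunny 16, direct daylight at ISO 100 is about f/16 at 1/100 s, i.e. roughly EV 15, and open shade is commonly quoted around EV 12:

```latex
\mathrm{EV} = \log_2\!\frac{N^2}{t} = \log_2\!\frac{16^2}{1/100} = \log_2 25600 \approx 14.6 \approx 15
```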

Yes, but if not 0, what then should be the default exposure?

Even if the default exposure doesn’t change, you could simply show the 9.416 EV that 0 exposure actually represents. Then the discussion of what alternative defaults might be can start.

It’s the unit, rather than the value, that is being emphasised.

I’m not sure I understand - you’re saying that it would be more sensible to show “9.416 EV” by default instead of 0? And that users would understand what that’s supposed to mean?

1 Like

I think showing EV, a common and simple concept, is likely better than showing 0 and having that represent 9.416 EV under the hood. If users don’t understand what EV means, I’m guessing the first search term might be ‘blender exposure EV’ and they’d find a suitable answer within 30 seconds. Showing 0 exposure as 9.416 EV just accentuates how arbitrary it is, if that figure is correct (I didn’t look at the source of this conversion calculation).

There just needs to be some visible and accessible link between real life light values and what I’m doing in engine. Hiding that behind clean numbers helps no-one.
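To make that concrete, here’s a minimal sketch (plain Python, not a patch, and not Blender’s actual code) of what such a label could do, assuming the 9.416 figure quoted above is right and assuming that raising the exposure slider by one stop is equivalent to metering for a scene one EV darker:

```python
# Minimal sketch: relabel an additive exposure slider (stops, like the
# Exposure setting under Blender's colour management) as a camera-style EV.
# Assumptions: slider value 0 corresponds to 9.416 EV (figure quoted above,
# unverified), and +1 stop of exposure meters for a scene 1 EV darker.

ZERO_EXPOSURE_EV = 9.416  # quoted in this thread; I haven't checked its derivation


def ev_label(exposure_stops: float) -> str:
    """Return a human-readable EV label for a given exposure slider value."""
    ev = ZERO_EXPOSURE_EV - exposure_stops
    return f"{ev:.3f} EV"


if __name__ == "__main__":
    for stops in (-2.0, 0.0, 3.0):
        print(f"exposure {stops:+.1f} stops -> {ev_label(stops)}")
```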

6 Likes

I too wonder about the source of the calculation - unless I’m mistaken, the equation on Wikipedia to derive the EV requires the ISO value of the film/sensor as input, which at this point is undefined.
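For reference, the relations I believe are being referred to (from the Wikipedia “Exposure value” article) are the camera-side definition and the metering relation that ties EV to scene luminance, and the ISO speed only enters through the latter:

```latex
\mathrm{EV} = \log_2\!\frac{N^2}{t},
\qquad
\mathrm{EV} = \log_2\!\frac{L\,S}{K}
```

with N the f-number, t the shutter time in seconds, L the scene luminance in cd/m², S the ISO speed, and K a meter calibration constant (about 12.5 cd·s/m² for most reflected-light meters). So a luminance-to-EV conversion is indeed undefined until an ISO is picked.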

Yep, I agree. I think it’s important to distinguish between camera EV and ‘scene’ EV. They’re related but distinct things. Either way, if we were to specify a physical camera, there would only be one correct answer, within the limits of light loss in the virtual camera system (which could probably just be ignored for now).

I think ISO is defined in relation to a physical light value and an EV (shutter speed and f-stop in this case), so it seems like a circular definition. :man_shrugging: I’d need to look into it further.

If you have film or a physical sensor with limited dynamic range, there is a definition of how to measure its ISO. In our case, the role of film or sensor is being played by colour management, not the renderer core.

1 Like