sRGB and SPI versus Nuke
Jeremy Selan <jeremy...@...>
Troy,
This difference in srgb luts between the spi-vfx profile and nuke-default is a bit more complicated than a simple 1D vs. 3D white-point difference. I only have a moment to answer this question right now (I'd love to add a full page to the OCIO website about this soon), but in the meantime: the D50 vs. D65 issue is dwarfed by the 1D differences in the transforms. Allow me to answer a question you didn't ask. :)
The input device linearizations in both the nuke-default and spi-vfx profiles attempt to map data from the input devices into a high-dynamic-range float space with plausible 'real world' values (often called scene-linear in the color community). Middle gray maps to 0.18, and specular data maps to >> 1.0 (think EXR).
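(For concreteness, here's a minimal Python sketch of the kind of 1D linearization being described, using the standard IEC 61966-2-1 sRGB decode. The actual spi-vfx and nuke-default transforms are their own curves/LUTs, so treat this purely as an illustration.)

    # Standard sRGB decode (IEC 61966-2-1): display code value -> linear.
    # Illustrative only; not the actual spi-vfx or nuke-default LUTs.
    def srgb_to_linear(c):
        if c <= 0.04045:
            return c / 12.92
        return ((c + 0.055) / 1.055) ** 2.4

    # Scene-linear middle gray is 0.18; an sRGB code value of ~0.46
    # decodes to roughly that.
    print(srgb_to_linear(0.46))  # ~0.18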
But in mapping this high-dynamic-range data to the limited dynamic range of an output display, you need some sort of tonemapping curve to create pleasing imagery. Josh Pines gave a wonderful talk on this issue at SIGGRAPH a few years ago: http://renderwonk.com/publications/s2010-color-course/pines/s2010_color_pines_slides.pdf
Film does this tonemapping intrinsically as a function of both the negative and print stocks, and digital camera workflows also emulate this to varying degrees. The spi-vfx profile emulates the tone mapping of a traditional film stock, which is why highlights >> 1.0 have a pleasant falloff. Even the spi-anim profile does a 1D approximation of a filmic tonemapping. The IIF ACES workflow also does tonemapping from scene-linear to display space (called the RRT, the Reference Rendering Transform).
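(This is not the spi-vfx film emulation or the ACES RRT, but as a hedged sketch of what any such curve does, the simple Reinhard operator shows the basic highlight rolloff:)

    # Simple Reinhard tonemap: compresses scene-linear [0, inf) into
    # display range [0, 1). NOT the spi-vfx film emulation or the RRT;
    # just the smallest curve that demonstrates highlight rolloff.
    def reinhard(x):
        return x / (1.0 + x)

    for v in [0.18, 1.0, 4.0, 16.0]:
        print(v, "->", round(reinhard(v), 3))
    # 0.18 -> 0.153, 1.0 -> 0.5, 4.0 -> 0.8, 16.0 -> 0.941
    # Speculars >> 1.0 roll off smoothly instead of slamming into 1.0.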
We must distinguish this HDR scene-linear data from the low-dynamic-range linear light coming from the display (display linear).
The nuke-default profile maps the scene-linear data directly to the linear light coming from the display (a low-dynamic-range device), side-stepping the issue of tonemapping. Thus, when one loads HDR data, the default srgb viewing transform appears to clip the highlights (everything above 1.0). This is not an issue with the srgb curve per se; the curve is 'correct', properly emulating a 'display linear' transfer curve. But directly feeding HDR linear data to a low-dynamic-range display will not lead to pleasing imagery.
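(To make the clipping concrete, here's a small sketch contrasting the two viewing paths for a specular value. The functions are stand-ins, not the actual nuke-default or spi-vfx implementations.)

    # Standard sRGB encode (IEC 61966-2-1); the clamp to [0, 1] is
    # exactly where the highlight clipping happens.
    def linear_to_srgb(x):
        x = min(max(x, 0.0), 1.0)
        if x <= 0.0031308:
            return 12.92 * x
        return 1.055 * x ** (1.0 / 2.4) - 0.055

    def reinhard(x):  # stand-in tonemap, as in the earlier sketch
        return x / (1.0 + x)

    spec = 4.0  # a scene-linear HDR specular
    print(linear_to_srgb(spec))            # 1.0  (clipped, nuke-default style)
    print(linear_to_srgb(reinhard(spec)))  # ~0.91 (tonemapped first)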
In terms of D50 vs. D65, I'm not sure what the nuke-default config is targeting, so you'd have to ask the developers on that front. (http://mysite.verizon.net/spitzak/conversion/ may be a good place to start.)
For the spi-vfx, spi-anim, and IIF profiles, the use of the word 'srgb' is there to imply that the proper viewing device should be calibrated to the D65 srgb spec; the srgb transfer curve itself does not explicitly appear in table form.
-- Jeremy
(cc'ing ocio-dev in case others have thoughts on this as well).
On Sun, Jan 8, 2012 at 6:45 PM, Troy Sobotka <troy.s...@...> wrote:
I was just looking through the LUTs and I noticed that the sRGB transform for SPI looks like a 3D LUT, which would imply that you folks are doing the _correct_ sRGB transfer while Nuke's isn't? Is that a correct assumption?

Nuke's 1D transform would fail to deliver to the sRGB spec of D50 white point under D65 illuminant, correct?

Or am I completely delusional here?
With respect,
TJS
Brendan Bolles <bre...@...>
On Jan 9, 2012, at 6:17 PM, Jeremy Selan wrote:

> In terms of D50 vs. D65, I'm not sure what the nuke-default config is targeting, so you'd have to ask the developers on that front. (http://mysite.verizon.net/spitzak/conversion/ may be a good place to start.)
Nuke doesn't concern itself with illuminants or color primaries, only worrying about the response curves. It also doesn't take any kind of display profiling into account (with the possible exception of the TrueLight node).
This is by design. Transforming an image from one illuminant to another involves channel cross-talk, which can lead to issues like your grain no longer being channel-independent. It can also introduce negative pixel values, which cause other problems. Whatever advantage there might be to using the extra color science quickly disappears in real-world production.
In my experience, exactly how you linearize something isn't crucially important. It's mainly important that you have some sort of 1D log2lin or sRGB2lin going on, and VERY important that you can invert it so you get back to the original source footage (plus the dinosaur you've added). In visual effects it might not even matter much if your monitor is all out of whack: as long as the CG matches the plate, it doesn't matter how you're looking at them. This is why the standard Nuke setup doesn't worry too much about color primaries.
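(A minimal sketch of that round-trip requirement, under the assumption of an sRGB curve pair; a Cineon log2lin/lin2log pair would be checked the same way:)

    # Invertibility check: lin2x(x2lin(v)) must return the original code
    # values so untouched plate pixels survive the round trip.
    def srgb_to_linear(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(x):
        return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1.0 / 2.4) - 0.055

    for code in [0.0, 0.1, 0.5, 0.9, 1.0]:
        assert abs(linear_to_srgb(srgb_to_linear(code)) - code) < 1e-9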
Now, if you want to preview the composite on a monitor and accurately see what it will look like in the theater, that's when you want to worry about color primaries and everything else. Fortunately, at that point the VFX is already done, so you can introduce channel cross-talk and non-invertible LUTs as much as you like.
Brendan