3D LUT size and bit depth questions


Est <rame...@...>
 

Hi All,

I recently started using OpenColorIO and I have some questions:

In my current code, I'm using a 32x32x32 LUT, just for the display,
not for processing.
Is that a good size? How do you choose a 3D LUT size?

In the spi-cg config, what's the difference between vd16 and vdf?
I see they have the same parameters and LUT; the only difference is
the bitdepth. I haven't read all the code yet, but it doesn't seem
that the bitdepth is used when converting colorspaces.

Thank you.

Est.


Jeremy Selan <jeremy...@...>
 

Est,

A 32x32x32 cube LUT is a totally reasonable size; it's what we happen
to use internally at Imageworks.

Some other popular sizes include 17x17x17 (Flame) and 33x33x33 (a few
commercial color graders). There are probably some clients that use
larger sizes, but beyond a point there can be noticeable performance
degradation. (Which is often image-content dependent; has anyone else
noticed that!?)
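To make the size/performance trade-off concrete, here's a minimal sketch (plain Python, not the OCIO API) of how a 32x32x32 display LUT is applied with trilinear interpolation. Every sampled pixel blends the 8 surrounding lattice entries, which is why bigger cubes mainly cost memory and cache behavior rather than more arithmetic per pixel. The identity LUT here is just for demonstration.

```python
# Illustrative sketch (not the OpenColorIO API): trilinear lookup in a
# 32x32x32 cube LUT, as discussed above.

N = 32  # cube size per axis

# Build an identity LUT: lut[r][g][b] -> (r, g, b) in [0, 1].
lut = [[[(r / (N - 1), g / (N - 1), b / (N - 1))
         for b in range(N)] for g in range(N)] for r in range(N)]

def apply_3d_lut(lut, rgb):
    """Trilinearly interpolate an input RGB triple through the cube."""
    n = len(lut) - 1
    idx, frac = [], []
    for c in rgb:
        x = min(max(c, 0.0), 1.0) * n  # scale to lattice coordinates
        i = min(int(x), n - 1)         # lower lattice index
        idx.append(i)
        frac.append(x - i)             # fractional position inside the cell
    out = [0.0, 0.0, 0.0]
    # Blend the 8 lattice entries surrounding the sample point.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((frac[0] if dr else 1 - frac[0]) *
                     (frac[1] if dg else 1 - frac[1]) *
                     (frac[2] if db else 1 - frac[2]))
                entry = lut[idx[0] + dr][idx[1] + dg][idx[2] + db]
                out = [o + w * v for o, v in zip(out, entry)]
    return out

print(apply_3d_lut(lut, (0.25, 0.5, 0.75)))  # identity LUT returns the input
```

A real display transform would of course fill the cube with the actual color transform rather than identity; the lookup cost per pixel is the same regardless of cube size.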

There's currently no way to query a recommended LUT size through the
OpenColorIO API. In the case of a transform having a 3D LUT, there may
be times where one particular size is 'native', but on many other
occasions there won't be a native resolution. The question of whether
or not to add such a function comes down to whether a 3D display
transform size is most often tied to a particular color configuration
(in which case OCIO is the right place), or whether it's more
appropriately tied to a client or plugin (in which case an API call
does not belong in OCIO).

Thoughts?

Good observation on the vd16/vdf issue. That's a bug in the
spi-cg profile, and we'll have it fixed in the next few weeks. (Both
profiles will be getting a huge upgrade.) What should be happening
is that they both reference the same LUT, but the vdf profile
should be using linear interpolation and the vd16 profile nearest.
(The table has 17 bits of entries, so there is no quality loss in
using nearest neighbor for 16-bit int data, and the performance is a
lot better.)
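To illustrate why nearest is safe here, a small sketch (assumed, not the actual spi-cg or OCIO code) comparing nearest and linear lookup in a dense 1D LUT. With 2^17 entries, nearest-neighbor snaps within half a table step, which is below the quantization of 16-bit integer input, so the two lookups agree to well within a 16-bit code value while nearest skips the interpolation arithmetic.

```python
# Sketch: nearest vs. linear 1D LUT lookup in a table denser than the
# input data's quantization. The sqrt shaper curve is a placeholder.
import math

def lut1d_nearest(table, x):
    """Look up x in [0, 1] by snapping to the closest table entry."""
    i = int(round(x * (len(table) - 1)))
    return table[i]

def lut1d_linear(table, x):
    """Look up x in [0, 1] with linear interpolation between entries."""
    pos = x * (len(table) - 1)
    i = min(int(pos), len(table) - 2)
    f = pos - i
    return table[i] * (1 - f) + table[i + 1] * f

# A table with 17 bits of entries, sampled at a 16-bit code value:
# the nearest lattice point is closer than one 16-bit quantization step.
table = [math.sqrt(i / (2**17 - 1)) for i in range(2**17)]
code = 40000 / 65535.0  # a 16-bit int sample, normalized

print(lut1d_nearest(table, code), lut1d_linear(table, code))  # both ≈ sqrt(code)
```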

You are completely correct that the bit depth tag is not used when
converting colorspaces. It is merely a tag that the UI can use. (All
of this will be explained in the docs in the near future.) :)
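For illustration, a hypothetical config fragment in the spirit of the fix described above: two colorspaces referencing the same LUT file, differing only in the bitdepth tag and the interpolation. The names of the LUT file and the exact values are placeholders, not the real spi-cg contents.

```yaml
# Hypothetical fragment; the LUT filename and values are placeholders.
  - !<ColorSpace>
    name: vdf
    bitdepth: 32f        # UI hint only; not used in the conversion math
    to_reference: !<FileTransform> {src: vd.spi1d, interpolation: linear}

  - !<ColorSpace>
    name: vd16
    bitdepth: 16ui       # UI hint only
    to_reference: !<FileTransform> {src: vd.spi1d, interpolation: nearest}
```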

-- Jeremy

ps - I'll be out of town for the next week, so my email response time
may suffer.

On Thu, Sep 23, 2010 at 10:48 AM, Est <rame...@gmail.com> wrote:


Est <rame...@...>
 

Hi Jeremy,

Thank you for your replies.
Everything is clear now.

I think that the LUT size is an implementation detail of the client,
and a recommended-LUT-size function wouldn't have much use,
as most clients / plugins would use a fixed-size LUT.

Est.