Gamut of Reversal Stock



Hi Friedmann,

 

Perhaps you read the histograms correctly in the first place. But there is a spike at the top end of the red channel, which makes no sense to me in terms of image statistics. I was interpreting that spike as what you meant by clipping, and reading it as clipping myself. But the spike must be an error of some sort.

 

But even if you ignore that spike, the red channel is still clipping, because the curve doesn't fall to zero at the top end (as per the normal statistics of a pictorial image) - which means the red channel here is, from a statistical probability point of view, being clipped - ie. the red channel is over-exposed.
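To make the statistical argument concrete, here is a minimal sketch of that kind of check, assuming an 8-bit RGB image held in a numpy array (the function name and the synthetic image are illustrative, not from the thread). A well-exposed channel's histogram falls smoothly toward zero at the top end; a large pile-up of pixels at the maximum value suggests clipping.

```python
import numpy as np

def channel_clipping_report(img):
    """For an 8-bit RGB image (H x W x 3 array), report per channel the
    fraction of pixels sitting at the maximum value (255). A large
    fraction there indicates the channel is clipped (over-exposed)."""
    report = {}
    for i, name in enumerate(("red", "green", "blue")):
        report[name] = float(np.mean(img[..., i] == 255))
    return report

# Synthetic example: red channel heavily clipped, green/blue not.
rng = np.random.default_rng(0)
img = rng.integers(0, 200, size=(100, 100, 3), dtype=np.uint8)
img[:50, :, 0] = 255  # force half the red pixels to the maximum
print(channel_clipping_report(img))
```

On the synthetic image this reports a red fraction of 0.5 and zero for green and blue - the pattern described above, where only the red channel is clipped.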

 

The green and blue channels could do with a little more exposure. But you can't do this in a single exposure without clipping the red channel further.

 

So the solution here is to do two exposures - one with a good non-clipped red exposure, and another with a good non-clipped green and blue exposure. You then copy the good red channel into the version with the good green and blue channels.
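The two-exposure merge described above can be sketched in a few lines of numpy (a hypothetical helper, assuming the two captures are already aligned frame-to-frame):

```python
import numpy as np

def merge_exposures(red_exposure, gb_exposure):
    """Combine two scans of the same frame: take the non-clipped red
    channel from one capture and the green/blue channels from the
    other. Both inputs are H x W x 3 arrays, assumed aligned."""
    merged = gb_exposure.copy()
    merged[..., 0] = red_exposure[..., 0]
    return merged
```

In practice the harder part is registration between the two captures; the channel copy itself is trivial.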

 

Carl


I should just add that you need to do the exposure settings in relation to the illuminant - the reference signal - rather than an image. You might want to put clear film in the gate when you do this, so as to take into account the base of the film.


But there is a spike at the top end of the red channel - which makes no sense to me in terms of image statistics

 

I took both a screenshot of your image and downloaded the profiled version of it, and compared the two. In the profiled version there is a spike in my histogram in more or less the same location as yours, but my histogram shows the curve going back down to zero in a relatively smooth fashion after the spike - indicating a correct exposure.

 

In the screenshot (the non-profiled version) there is no spike in the histogram.

 

Carl


 

Absolutely. In terms of colour management, regardless of system, the CIE 1931 colour space acts as a standard reference colour space. It is by virtue of its historically fixed and well defined meaning that it forms the basis of many colour management systems. That is not to say it can't be improved. For example, equal distances in the 1931 space are not perceived as equal. An improvement on the 1931 space, in terms of perceptually equivalent distances, is the 1976 CIE LUV space. But by using the earliest well defined reference as a standard reference, all other systems can inter-communicate with each other, by reference back to this standard.
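For the curious, the hop from 1931 tristimulus values to the more perceptually uniform 1976 u'v' chromaticities is just a projective formula. A small sketch (the function name is mine; the formulas are the standard CIE 1976 definitions):

```python
def xyz_to_uv_prime(X, Y, Z):
    """CIE 1976 u'v' chromaticity from CIE 1931 XYZ tristimulus values:
    u' = 4X / (X + 15Y + 3Z),  v' = 9Y / (X + 15Y + 3Z)."""
    d = X + 15.0 * Y + 3.0 * Z
    return 4.0 * X / d, 9.0 * Y / d

# Equal-energy white (X = Y = Z) lands at u' = 4/19, v' = 9/19.
u, v = xyz_to_uv_prime(1.0, 1.0, 1.0)
```

Distances measured in (u', v') are closer to perceptual difference than distances in the 1931 (x, y) diagram, which is why the 1976 space is preferred for judging how far apart two chromaticities look.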

 

The 1931 CIE space shows why any tristimulus gamut (eg. the gamut of RGB phosphors) cannot represent all possible chromaticities: no triangle (three points) within the horseshoe shape of the colour space (the visible spectrum) can cover all the chromaticities within that shape. Some new monitors being manufactured today introduce a fourth stimulus - yellow - in addition to red, green and blue, to better cover the horseshoe shape.

 

Carl


You know that there are shades of color that cannot be represented by RGB but only by CMYK. Film uses the latter; all sensors I am aware of use a variation of RGB. How can this be solved? By multiple scans in subsets of the spectrum?

 

If you look at how colours in RGB space are transformed to colours in CMYK space, or vice versa, there need not be any loss of information, as the transforms are invertible:

 

CMY to RGB:

 

R = 1 - C

G = 1 - M

B = 1 - Y

 

and CMY to CMYK:

 

K = min(C,M,Y)

C = C - K

M = M - K

Y = Y - K
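The formulas above can be wired up directly to show the invertibility claim - a round trip from RGB through CMY and CMYK and back recovers the original values exactly (function names are mine; the arithmetic is exactly the formulas above, with components in the 0..1 range):

```python
def rgb_to_cmy(r, g, b):
    # RGB to CMY: each component is the complement of the other.
    return 1.0 - r, 1.0 - g, 1.0 - b

def cmy_to_rgb(c, m, y):
    # The inverse is the same complement.
    return 1.0 - c, 1.0 - m, 1.0 - y

def cmy_to_cmyk(c, m, y):
    # Pull the common component out into the black (K) channel.
    k = min(c, m, y)
    return c - k, m - k, y - k, k

def cmyk_to_cmy(c, m, y, k):
    # Fold the black channel back in.
    return c + k, m + k, y + k
```

No information is lost in either direction; the lossy step only appears when a physical medium can't realise some of the numbers.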

 

But if one is talking about the physical colours (eg. RGB phosphors, CMYK inks on paper, CMY dyes in film) then each of these "light sources/modifiers" may not be able to reproduce the colours (light) that the others can. This works both ways. There are colours which RGB phosphors can render that CMYK inks can't, and vice versa. This has nothing to do with any intrinsic limit in the colour spaces used - only that the media may not exploit the full range of the colour space (whether RGB or CMYK) in which they are represented.

 

We can always define an RGB/CMYK colour space with 1931 CIE reference points outside the visible spectrum, so that it covers the entire range of visible light within the 1931 colour space. But there will be points within such RGB/CMYK colour spaces that none of the media (in terms of visible light) can ever touch, because the reference points are outside the visible spectrum.

 

An RGB/CMYK colour space defined with points outside the visible spectrum allows for numbers (within the colour space) that don't correspond to any visible colour of the media you want to represent in that space.

 

A better alternative is to simply use the 1931 CIE colour space itself. You can define subsets of that space (called gamuts) to represent physical media such as RGB phosphors, or CMYK inks on paper, viewed in sunlight, etc.
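Representing a gamut as a subset of the 1931 space makes "is this chromaticity reproducible?" a simple geometry test. A sketch, using the sRGB primaries as an example triangle (the choice of sRGB and the function names are mine, not from the thread; the primaries' xy coordinates are the published sRGB values):

```python
def in_gamut(p, triangle):
    """Test whether xy chromaticity p lies inside the triangle formed
    by three primaries (each an (x, y) pair), via cross-product signs:
    p is inside iff it is on the same side of all three edges."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    r, g, b = triangle
    s1, s2, s3 = cross(r, g, p), cross(g, b, p), cross(b, r, p)
    return (s1 >= 0 and s2 >= 0 and s3 >= 0) or (s1 <= 0 and s2 <= 0 and s3 <= 0)

# sRGB primaries in CIE 1931 xy chromaticity coordinates.
SRGB = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]

# The D65 white point (0.3127, 0.3290) lies inside the sRGB triangle;
# a saturated spectral green such as (0.17, 0.80) lies outside it.
```

The same test works for any tristimulus medium - phosphors, inks under a given illuminant, film dyes - once its primaries are located in the 1931 diagram.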

 

Carl


One of the white balance settings on the Canon is for tungsten light. So if doing the transfer using tungsten light (for which reversal film is designed), one shouldn't need a custom ICC profile - just a little fine tuning of the signal in the context of an existing profile (to which the tungsten-balanced capture will already be correctly mapped). Another white balance setting is for flash. But since flash will have a different colour to that for which reversal is designed, using it would require either a custom ICC profile or a custom filter in the context of an existing profile.

 

A custom filter in the context of an existing profile is the easiest way to go, but at the expense of some slight loss of bandwidth. If intending to use flash in the transfer, then suitable experiments comparing flash-captured signals with tungsten-captured signals should establish pretty accurate parameters for the custom filter.
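One simple form such an experiment could take (a sketch under my own assumptions, not the poster's procedure): capture the same reference target under both light sources, then fit a per-channel gain mapping one capture onto the other by least squares.

```python
import numpy as np

def estimate_channel_gains(flash_capture, tungsten_capture):
    """Fit per-channel gains g so that g * flash ~= tungsten, from
    paired captures of the same reference target. Inputs are N x 3
    arrays of linear RGB samples; each gain is the least-squares
    solution dot(f, t) / dot(f, f) for that channel."""
    gains = []
    for i in range(3):
        f = flash_capture[:, i].astype(float)
        t = tungsten_capture[:, i].astype(float)
        gains.append(float(np.dot(f, t) / np.dot(f, f)))
    return gains
```

A diagonal (per-channel) filter like this is the crudest model - it ignores any cross-channel mixing - but it is often good enough for fine tuning within an existing profile.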

 

Carl


IIRC, Canon CMOS sensors have the highest bandwidth at 5500K.

The white balance settings are irrelevant for Raw photography anyway.

 

The white balance setting isn't irrelevant. It would be embedded in the raw file.

 

But I only mention the white balance in terms of using the Canon software, where the white balance selection allows the software to put the raw data into the correct ICC context for rendering - ie. so the image won't render as orange. Using the white balance button in conjunction with the Canon software avoids the necessity of creating a custom ICC profile. It will be done for you.

 

But if using a daylight balanced light source, a white balance setting (of daylight) will be incorrect. There will be a minor mis-mapping of data when it comes to rendering. It might still look okay to the eye, though.

 

When writing my own software I have to establish the equivalent of a white balance setting - a parameter that ensures that when the data is rendered to the screen, it renders correctly. Whether I extract this from the raw file, enter it manually in the code, have it entered through a GUI, or read it from a config file - it still needs to be done.

 

And if I use a daylight balanced light source (which is a good idea for bandwidth purposes) there is a double mapping I have to take into account: 1. the sensor data doesn't directly map to the rendering context (whatever it happens to be), and 2. the film is being illuminated by light for which it wasn't designed.
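In the simplest (diagonal, von Kries style) model, each of those two mappings is just a triple of per-channel gains, and the double mapping is their product. A sketch - the gain values below are placeholders, to be established by measurement, not real calibration data:

```python
import numpy as np

def render_correction(raw_rgb, sensor_gains, illuminant_gains):
    """Compose the two diagonal corrections described above: one mapping
    sensor data into the rendering context, one compensating for the
    film being lit by an illuminant it wasn't designed for. Both gain
    triples are hypothetical parameters."""
    g = np.asarray(sensor_gains) * np.asarray(illuminant_gains)
    return np.asarray(raw_rgb, dtype=float) * g
```

A real pipeline may need a full 3x3 matrix rather than two diagonals, but the point stands: both mappings must be accounted for somewhere, whether read from the raw file, hard-coded, or entered through a GUI or config file.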

 

Carl

Edited by Carl Looper
