Out of Gamut: Cracking the Color Code of the IT8 Target

I set out to reverse-engineer the IT8 target—a tool photographers use to create camera or scanner color profiles. Wolf Faust made them affordable, and each target comes with a reference file that describes the true color of every patch in XYZ coordinates, independent of any device.

Using a Python script, I converted this reference data into sRGB to visualize how those colors would appear on a D50-calibrated screen. That’s when the trouble started. Many of the colors—especially the vibrant reds, greens, and yellows that slide film is famous for—fell outside the sRGB color gamut. The math didn’t fail, but the sRGB color space turned out to be uncomfortably small. These out-of-gamut colors were clipped, distorted, or outright lost—confirming what many slide film photographers already know: digital just can’t quite match the original.
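For context, here is a minimal sketch of the standard XYZ-to-sRGB conversion with a gamut check, not the exact code from the article; it assumes the XYZ values are already adapted to sRGB's D65 white point and scaled so Y falls in 0..1, and the patch value at the end is made up:

```python
import numpy as np

# XYZ (D65 white point, Y scaled to 0..1) -> linear sRGB (IEC 61966-2-1)
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def xyz_to_srgb(xyz):
    """Convert one XYZ triple to gamma-encoded sRGB and report gamut clipping."""
    rgb_lin = XYZ_TO_SRGB @ np.asarray(xyz, dtype=float)
    # Any channel outside 0..1 before clipping means the color is out of the sRGB gamut.
    out_of_gamut = bool(np.any(rgb_lin < 0.0) or np.any(rgb_lin > 1.0))
    rgb_lin = np.clip(rgb_lin, 0.0, 1.0)
    # sRGB transfer function (piecewise: linear toe, then ~2.4 gamma)
    rgb = np.where(rgb_lin <= 0.0031308,
                   12.92 * rgb_lin,
                   1.055 * rgb_lin ** (1.0 / 2.4) - 0.055)
    return rgb, out_of_gamut

# Made-up saturated patch value, not taken from the reference file
rgb, clipped = xyz_to_srgb([0.30, 0.15, 0.05])
print(rgb, "(out of gamut)" if clipped else "(in gamut)")
```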

Even more revealing: the IT8 chart shown on Faust’s own website is idealized. Its colors don’t match the actual reference data. If anyone uses that image to manually correct scans by eye, they’re working from a misleading source.

Bottom line? Film still captures more color than digital screens can display. Profiles help, but some hues are inevitably lost in translation. On a practical level, I would say you should scan slides as bright as possible, since it's the dark colors that have the most trouble being captured faithfully. You can turn the brightness back down in post as needed. The full article and a link to the code are at

https://medium.com/full-frame/out-of-gamut-cracking-it8-calibration-test-target-ac4b7044cb73

I will certainly be happy to discuss this, and I expect quite a bit of critique - which is perfectly fine.


BTW: Camera sensors go beyond today’s screen gamuts too.

Enter Pointer’s gamut. Read more about it here


What about the more recent HDR screens appearing on new laptops?

Usually I'm cautious about scanning slides bright, as highlights are difficult to recover, unlike shadows.

Very interesting… do you think it would be possible to produce a LUT from the calibration slides?

If I have the raw value of a color from my digital camera and the correct measured value of that color as rendered on the film, it should be possible to map between them, right?

That's what a color profile builder like Lumariver does. That's the whole intent of the target.
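Conceptually, the profile builder fits a transform from the camera's raw patch readings to the measured reference values for the same patches. A toy sketch of that idea, with made-up numbers and a single least-squares matrix standing in for the much richer models (LUTs, tone curves) that real tools fit:

```python
import numpy as np

# Hypothetical data: one row per target patch.
raw_rgb = np.array([[0.82, 0.31, 0.12],   # camera raw readings
                    [0.20, 0.55, 0.18],
                    [0.10, 0.15, 0.60],
                    [0.70, 0.70, 0.68]])
ref_xyz = np.array([[0.35, 0.22, 0.05],   # measured reference values
                    [0.14, 0.30, 0.11],
                    [0.12, 0.10, 0.45],
                    [0.60, 0.63, 0.62]])

# Solve raw_rgb @ M ≈ ref_xyz for a 3x3 matrix M in the least-squares sense.
M, *_ = np.linalg.lstsq(raw_rgb, ref_xyz, rcond=None)

# Map a new raw reading through the fitted matrix.
new_raw = np.array([0.40, 0.40, 0.40])
print(new_raw @ M)
```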


Ah that is so interesting! Thank you for this information. I will check this out.

Yes, that’s true. I try to scan just before transparent white clouds start to clip (judging in LR, not in camera)

I read the full essay on Medium. Very well conceived and written; I was captivated. Looking over the values from the color analysis, I was puzzled by some of the negative values, as I was not expecting some of the colors to be that far out of the sRGB gamut. Overall, a great read, and I enjoyed geeking out on your project.


Some folks pointed out that I scaled the data too aggressively in an attempt to normalize XYZ. I recalculated the chart, and while the white patches are now light grey rather than white, out-of-gamut colors are still a regular occurrence, with one exception: when I converted XYZ to ProPhoto RGB, there was not a single out-of-gamut case. Here is how the untagged ProPhoto RGB PNG looks:

Pretty dull, right? But now let's see what happens when we take this image and assign the ProPhoto RGB profile in Photoshop. I also applied a linear curve to lift the whites.

Basically this is to be expected, but it's still interesting to see how a proper color profile changes the look of the picture. Now color-aware Photoshop interprets the file not as generic untagged data but as ProPhoto RGB, and displays it by recalculating the colors back and forth for my display. Luckily, I have my display calibrated.
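For anyone who wants to reproduce the ProPhoto step, here is a minimal sketch of the conversion, not the exact code from the article; it assumes the reference XYZ is D50-referenced and scaled so Y falls in 0..1, and uses the standard ROMM RGB matrix and transfer function:

```python
import numpy as np

# XYZ (D50, Y in 0..1) -> linear ProPhoto RGB. ROMM RGB is D50-native,
# so D50-referenced IT8 data needs no chromatic adaptation here.
XYZ_TO_PROPHOTO = np.array([
    [ 1.3459433, -0.2556075, -0.0511118],
    [-0.5445989,  1.5081673,  0.0205351],
    [ 0.0000000,  0.0000000,  1.2118128],
])

def xyz_to_prophoto(xyz):
    """Convert one XYZ triple to gamma-encoded ProPhoto RGB and flag clipping."""
    rgb_lin = XYZ_TO_PROPHOTO @ np.asarray(xyz, dtype=float)
    out_of_gamut = bool(np.any(rgb_lin < 0.0) or np.any(rgb_lin > 1.0))
    rgb_lin = np.clip(rgb_lin, 0.0, 1.0)
    # ROMM transfer function: linear toe below 1/512, gamma 1.8 above.
    rgb = np.where(rgb_lin < 1.0 / 512.0,
                   16.0 * rgb_lin,
                   rgb_lin ** (1.0 / 1.8))
    return rgb, out_of_gamut
```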

That was very interesting, so I dug into this whole colour space thing. I wanted to understand how to interpret the tristimulus values from IT8 targets. I grabbed some reference values from the website, the Ektachrome E240220, if anyone wants to replicate this.

I plotted the visible spectrum and the colour spaces of sRGB and AdobeRGB. It was interesting to note that the human eye is not able to see pure colours, i.e. the values in the corners of the XYZ coordinate system. A pure green cannot be perceived because a ray of light always stimulates all the cone types in the human eye, so what we see is always a mixture of the three primary colours.

I was interested in how @VladS calculated the sRGB values and why so many colours are cut off. I didn't transform the XYZ values into a specific colour space and gamma-correct them. I simply took the values and calculated the chromaticity normalisation to enter them into the colour chart:

x = X / (X + Y + Z)
y = Y / (X + Y + Z)
z = 1 - x - y

I then plotted all the values from the reference file into a 3D plot, which is what the orange bubbles are.
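For anyone who wants to replicate the plot, here is a minimal sketch of that normalisation and the 3D scatter; the XYZ triples are placeholders, not values from the Ektachrome reference file:

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder XYZ triples in the format of the reference file.
xyz = np.array([
    [12.3,  6.1,  2.0],
    [ 8.4, 15.2,  4.7],
    [ 5.1,  4.0, 22.8],
])

s = xyz.sum(axis=1, keepdims=True)
x = xyz[:, 0:1] / s          # x = X / (X + Y + Z)
y = xyz[:, 1:2] / s          # y = Y / (X + Y + Z)
z = 1.0 - x - y              # z = 1 - x - y

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(x, y, z, color="orange")
ax.set_xlabel("x"); ax.set_ylabel("y"); ax.set_zlabel("z")
plt.show()
```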

Also interesting is the 3D representation of this plot:

As you can see, indeed many, many colours are out of gamut. I cross-checked a number of them against @VladS's values, and I think your first calculation of the sRGB values was actually correct, because it's not about scaling too aggressively. You are taking XYZ values from Wolf Faust, which are already scaled values. The XYZ values are based on 'the CIE XYZ standard observer colour matching functions' (CIE 1931 color space - Wikipedia), so you just need to normalise them using the equations above.

Interestingly, AdobeRGB covers most of the colours. In any case, I think I now understand the whole colour theory much better. So thank you for the inspiration!


Wow, I never thought of that!

Very interesting , indeed!
