I set out to reverse-engineer the IT8 target—a tool photographers and scanner operators use to create color profiles for their cameras or scanners. Wolf Faust made them affordable, and each target comes with a reference file that describes the true color of every patch in XYZ coordinates, independent of any device.
Using a Python script, I converted this reference data into sRGB to visualize how those colors would appear on a D50-calibrated screen. That’s when the trouble started. Many of the colors—especially the vibrant reds, greens, and yellows that slide film is famous for—fell outside the sRGB color gamut. The math didn’t fail, but the sRGB color space turned out to be uncomfortably small. These out-of-gamut colors were clipped, distorted, or outright lost—confirming what many slide film photographers already know: digital just can’t quite match the original.
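For anyone curious, the core of such a conversion looks roughly like this. It's a minimal sketch, not the actual script from the article; it assumes the reference XYZ values are D50-relative with white normalized to Y = 1, and it uses the Bradford-adapted D50 XYZ → linear sRGB matrix (sRGB's native white is D65, so the D50 data must be chromatically adapted; this matrix has that baked in):

```python
import numpy as np

# Bradford-adapted D50 XYZ -> linear sRGB matrix.
XYZ_D50_TO_SRGB = np.array([
    [ 3.1338561, -1.6168667, -0.4906146],
    [-0.9787684,  1.9161415,  0.0334540],
    [ 0.0719453, -0.2289914,  1.4052427],
])

def xyz_to_srgb(xyz):
    """Convert one D50-relative XYZ triple (Y normalized to 1) to sRGB."""
    rgb_linear = XYZ_D50_TO_SRGB @ np.asarray(xyz, dtype=float)
    # Any component outside [0, 1] means the colour lies outside sRGB.
    out_of_gamut = bool(np.any(rgb_linear < 0.0) or np.any(rgb_linear > 1.0))
    rgb = np.clip(rgb_linear, 0.0, 1.0)
    # Standard sRGB transfer function (gamma encoding).
    rgb = np.where(rgb <= 0.0031308,
                   12.92 * rgb,
                   1.055 * rgb ** (1 / 2.4) - 0.055)
    return rgb, out_of_gamut

# A saturated red-ish patch: the green channel goes negative -> clipped.
print(xyz_to_srgb([0.30, 0.12, 0.02]))
```

A negative component in the linear RGB vector is exactly the "out of gamut" signal: no combination of the sRGB primaries can reproduce that colour.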
Even more revealing: the IT8 chart shown on Faust’s own website is idealized. Its colors don’t match the actual reference data. If anyone uses that image to manually correct scans by eye, they’re working from a misleading source.
Bottom line? Film still captures more color than digital screens can display. Profiles help, but some hues are inevitably lost in translation. On a practical level, I would say you need to scan slides as bright as possible, since it's the dark colors that have the most trouble; bright ones are captured fine. You can turn the brightness back down in post as needed. The full article and a link to the code are at
Very interesting… do you think it would be possible to produce a LUT from photos of the calibration slides?
If I have the raw value of a color from my digital camera and the correct measured value of the same color as rendered by the film, it should be possible to map between them, right?
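Something like this is what I have in mind. It's a rough sketch with hypothetical placeholder arrays (`camera_rgb`, `reference_rgb`), fitting the simplest possible mapping, a 3×3 matrix, by least squares; real profiling tools build full 3D LUTs with interpolation to capture non-linear behaviour:

```python
import numpy as np

# Hypothetical data: the linear raw RGB the camera recorded for each IT8
# patch, and the corresponding target values derived from the reference
# file (both in the same working space). An IT8 target typically has
# 288 patches.
camera_rgb = np.random.rand(288, 3)      # placeholder
reference_rgb = np.random.rand(288, 3)   # placeholder

# Fit a 3x3 matrix M so that camera_rgb @ M ~ reference_rgb.
M, *_ = np.linalg.lstsq(camera_rgb, reference_rgb, rcond=None)

def correct(rgb):
    """Map a raw camera triple towards the film's measured colour."""
    return np.asarray(rgb, dtype=float) @ M

print(correct([0.5, 0.4, 0.3]))
```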
I read the full essay on Medium. Very well conceived and written; I was captivated. Looking over the values from the color analysis, I was puzzled by some of the negative values, as I was not expecting some of the colors to be that far outside the sRGB gamut. Overall, a great read, and I enjoyed geeking out on your project.
Some folks pointed out that I scaled the data too aggressively in an attempt to normalize XYZ. I recalculated the chart, and while the white patches are now light grey rather than white, out-of-gamut colors are still a regular occurrence, with one exception: I converted XYZ to ProPhoto RGB, and there was not a single out-of-gamut case. Here is how the untagged ProPhoto RGB PNG looks:
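For reference, the ProPhoto RGB check is a near copy of the sRGB one; again a minimal sketch assuming D50-relative XYZ with Y normalized to 1. Conveniently, ProPhoto RGB (ROMM RGB) also uses a D50 white point, so no chromatic adaptation is needed here:

```python
import numpy as np

# XYZ (D50) -> linear ProPhoto RGB (ROMM RGB) matrix.
XYZ_D50_TO_PROPHOTO = np.array([
    [ 1.3459433, -0.2556075, -0.0511118],
    [-0.5445989,  1.5081673,  0.0205351],
    [ 0.0000000,  0.0000000,  1.2118128],
])

def xyz_to_prophoto(xyz):
    rgb_linear = XYZ_D50_TO_PROPHOTO @ np.asarray(xyz, dtype=float)
    out_of_gamut = bool(np.any(rgb_linear < 0.0) or np.any(rgb_linear > 1.0))
    rgb = np.clip(rgb_linear, 0.0, 1.0)
    # ROMM transfer function: gamma 1/1.8 with a short linear toe.
    rgb = np.where(rgb < 1.0 / 512.0, 16.0 * rgb, rgb ** (1.0 / 1.8))
    return rgb, out_of_gamut

# The same saturated patch that falls outside sRGB fits inside ProPhoto.
print(xyz_to_prophoto([0.30, 0.12, 0.02]))
```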
Pretty dull, right? But now let's see what happens when we take this image and assign the ProPhoto RGB profile in Photoshop. I also applied a linear curve to lift the whites.
Basically, this is to be expected, but it's still interesting to see how a proper color profile changes the look of the picture. Now color-aware Photoshop interprets the file not as generic untagged data, but as ProPhoto RGB, and displays it by recalculating the colors back and forth for my display. Luckily, I have my display calibrated.
That was very interesting, so I dug into this whole colour space thing. I wanted to understand how to interpret the tristimulus values from IT8 targets. I grabbed some reference values from the website, the Ektachrome E240220, if anyone wants to replicate this.
I plotted the visible spectrum and the colour spaces of sRGB and Adobe RGB. It was interesting to note that the human eye is not able to see pure colours, i.e. the values in the corners of the XYZ coordinate system. A pure green cannot be perceived, because a ray of light always stimulates all the cone types in the human eye, so what we see is always a mixture of three primary colours.
I was interested in how @VladS calculated the sRGB values and why so many colours are cut off. I didn't transform the XYZ values into a specific colour space or gamma-correct them. I simply took the values and normalised them to place them on the colour chart:
x = X / (X + Y + Z)
y = Y / (X + Y + Z)
z = 1 - x - y
I then plotted all the values from the reference file in a 3D plot; those are the orange bubbles.
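In code, the normalisation and plot amount to something like this (a sketch: it assumes the XYZ columns from the reference file have already been extracted into a plain text file, and the file name is made up):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical (N, 3) text file holding the extracted XYZ columns.
X, Y, Z = np.loadtxt("e240220_xyz.txt").T

# Chromaticity normalisation from the equations above.
s = X + Y + Z
x = X / s
y = Y / s
z = 1.0 - x - y  # equivalently Z / s

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(x, y, z, color="orange")
ax.set_xlabel("x")
ax.set_ylabel("y")
ax.set_zlabel("z")
plt.show()
```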
As you can see, many, many colours are indeed out of gamut. I cross-checked many of them against @VladS's values, and I think your first calculation of the sRGB values was actually correct, because it's not a matter of scaling too aggressively: you are taking XYZ values from Wolf Faust, which are already scaled. The XYZ values come from 'the CIE XYZ standard observer colour matching functions' (CIE 1931 color space - Wikipedia), so you just need to normalise them using the equations above.
Interestingly, Adobe RGB covers most of the colours. In any case, I now understand the whole colour theory much better. Thank you for the inspiration!