Matching Lab Colors without success

I’m just getting into scanning my own stuff, and before I dive into chasing the ultimate sharpness I’d like to get my colors down; so far they’re just not getting where I want them. I have all of my current negatives scanned professionally at the Indie Film Lab, so I have comparisons. This one, for instance, I just can’t nail.

As compared to what I got from the lab:

For the light source I used an iPad with Night Shift turned off, since that’s recommended as a neutral light source. I also have a CRI 95+ rated video light panel coming in two days.

Is there anything I’m missing? I shoot the negative, white balance to the film border, crop it, and then run it through NLP. I’ve tried various color models, but nothing seems to make a big difference. I’ve also tried reducing the border buffer to 0, since I’m cropping beforehand anyhow.
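For what it’s worth, the idea behind that workflow (white balance to the film base, then invert) can be sketched in a few lines. This is a toy model only, with made-up sample values; NLP’s actual conversion is far more sophisticated:

```python
def invert_negative(pixel, border):
    """Toy orange-mask removal + inversion for one RGB pixel.

    A sketch only, not NLP's actual algorithm. `pixel` and `border` are
    linear RGB triples in (0, 1]; `border` is sampled from the unexposed
    film base (the orange mask).
    """
    # "White balancing to the border" divides out the mask color, so the
    # film base reads as neutral.
    balanced = [p / b for p, b in zip(pixel, border)]
    # The negative is denser (darker) where the scene was brighter, so
    # inverting the transmittance recovers a positive.
    return [1.0 / max(v, 1e-6) for v in balanced]


# The border itself maps to a neutral [1, 1, 1], which becomes the black
# point once the image is normalized; denser areas come out brighter.
print(invert_negative([0.8, 0.5, 0.3], [0.8, 0.5, 0.3]))
print(invert_negative([0.4, 0.25, 0.15], [0.8, 0.5, 0.3]))
```

The point of the sketch is just that the border sample sets the per-channel baseline, which is why a clean white balance on the film base matters so much before conversion.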

I have the negative raw here in case someone is kind enough to give it a shot on their system:

http://share.xsnudes.com/Studio+Session-006.RAF

For the first time ever I have negatives coming that aren’t professionally scanned, and now I’m nervous that I won’t be able to master this myself.

You can ask yourself the following questions:

  • do I want/have to match the colours to the lab’s?
  • do I want/have to match the colours to what (I remember) they were when the shot was taken?
  • do I want/have to get correct colours or the colours that I like?

Whatever your answers might be, NLP has options and sliders that let you modify the appearance of an image over a wide range. Getting to know what you can do takes some practice, though. Learn more about it starting here: Getting Started with Negative Lab Pro | Negative Lab Pro

Yes, everything that Digitizer says. Before I read the text I assumed that the top image was the professional scan: a little over-exposed (easily corrected) but with natural-looking colours, whereas the lower image is a bit warm for my liking. The results from colour negative, whether from a professional lab or even if you were printing from the negatives in the darkroom, are necessarily subjective in a way that scans from colour transparency never are, though you are free to be subjective with those too.

Thanks for the replies, so far.

I’ve gone through all the settings multiple times and I’ve found myself unable to get the same look. I’m pretty familiar with Capture One and Lightroom, so tweaking this sort of thing isn’t unfamiliar to me. It just doesn’t seem possible without making small changes to shadows, mids, and highlights for every different image, and even then it doesn’t look the same. To me it seems like it’s different off the bat for them on their Noritsu/Frontier scanners, and maybe there’s nothing I can do about it myself.

My goal is to be able to match the colors from the lab so I can understand how they do it. It’s not necessarily where I always want to go, but they are doing some things better than I’m doing them right now. In some images the greens pop more than I can get them to; in this one the red bathing suit looks better from the lab, stuff like that.

I’ve noticed on YouTube a lot of people use NLP just for conversion and then clean things up in Lightroom with the copy it generates. Maybe I’ll have to use that workflow? I was just hoping things would be a bit more lab-like off the bat.

Just wondering if you know which scanner your lab actually uses for your scans, a Noritsu or a Fuji Frontier?

You’ve probably seen this, but here’s some information on the differences:

Hi @xsnudes ,

Yes, it is possible to get your camera scan very close to your lab scan with just a few simple adjustments in Negative Lab Pro, and tweaking in Lightroom’s HSL panel.

The most important adjustment above is adding yellow and magenta to the “highs” color tab. This gets the color balance closer to this particular lab scan. Using “LAB - soft” and adding both “Lab Glow” and “Lab Fade” helps bring the tonality closer. And finally, to match the color of the red suit, I’ve used the HSL panel in Lightroom (on the original RAW) using the tooltip and dragging on that part of the photo until it is closer.
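As an aside, the “add yellow and magenta to the highs” move can be modeled as a luminance-weighted shift away from blue and green. The amounts below are invented for illustration, not Nate’s actual slider values, and this is only a rough analogy for what NLP’s “highs” tab does:

```python
def warm_highlights(pixel, yellow=0.05, magenta=0.03):
    """Sketch of a highlight color-balance shift (illustrative values only).

    Adding yellow subtracts blue, and adding magenta subtracts green; the
    shift is weighted by luminance so highlights move more than shadows.
    """
    r, g, b = pixel
    lum = 0.2126 * r + 0.7152 * g + 0.0722 * b   # Rec. 709 luminance
    w = lum * lum                                 # emphasize the highlights
    return (r,
            max(g - w * magenta, 0.0),
            max(b - w * yellow, 0.0))


# A bright pixel is pulled warm, while a black pixel is left untouched.
print(warm_highlights((1.0, 1.0, 1.0)))
print(warm_highlights((0.0, 0.0, 0.0)))
```

Luminance weighting is what keeps the shadows neutral while the highlights drift toward the warmer lab look.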

Here are the HSL settings I ended up with.

It’s up to you, but once you learn the tools in Negative Lab Pro and how they work, it is very quick to edit or match scans. It took me about 30 seconds to match your scan… but of course, I have a slight advantage being the developer!

Hope that helps!

-Nate

They use both, you can choose, and apparently I chose the Noritsu this time.


Thanks, that does help a lot. It’s a bit mystifying to me how a Noritsu scan goes in this direction so far from how NLP interprets the negative, but as all this stuff is subjective I’ll just have to find colors that work for me.


I don’t see any discussion here about color management. Are you using a color-managed, calibrated monitor, profiled with a colorimeter (puck) and one of the calibration software packages, such as Spyder or basICColor Display? Have you set Lightroom and Photoshop to a color space of your own choosing, told them to respect profiles from other software, and set the behavior for opening files with no profile? If you aren’t on a color-calibrated and profiled monitor, you will labor endlessly trying to make your color predictable and accurate. You simply can’t do it otherwise, and you will waste hours, days, months and more in frustration. I know from personal experience!

I’m going to skip tips and thoughts about the technical side of this problem, because that’s well covered by Nate and others above. You’ll get there.
I just want to stress one thing: you mentioned that you’re afraid you won’t be able to match the lab-scan results yourself on your first set of negatives coming back without scans.
You need to rid yourself of that preconceived notion, because they’re negatives, and the lab’s scans are just one interpretation. Going forward you’re going to receive negatives for which you will NOT HAVE a lab reference scan “colouring” (for lack of a better word) your idea of what the image “should” look like. You’ll be starting fresh. You’ll get it to look like it should to you. Getting to know the tools better will help you get there quicker, but not having something you feel you need to arbitrarily match will go a long way toward removing this “problem” altogether. You will scan and edit your images to your satisfaction (I’m assuming you won’t stop before you’re satisfied). 🙂

To me the point of matching the lab scan was to understand the software well enough that I could match what other people have done before I put my own spin on things. I have since scanned other stuff from the lab (and stuff that’s never been scanned by the lab) and have actually gotten better results than they did, at least to suit my own taste. I think what I got most out of Nate’s example is that you sometimes have to take things outside of NLP and keep tweaking, where before I thought I could get all the way there just in NLP.

I have to say that sometimes NLP comes up with extremely strange colors, like neon-yellow skin, but I tweak them back to what I think is normal. Other times it does a great job and things look better than I could imagine. It just took some getting used to that you take a picture of something in reality and the positive doesn’t end up looking close to how it looked when you took it, but that just seems to be the nature of converting negatives to positives.
