Why do these lab scans look so much better?

Hello everyone,

I’m relatively new to NLP and digital scanning, but I’ve had some really good results with the 4 rolls I’ve scanned so far (x2 colour, x2 B/W).

I decided to scan some old negatives that I’d had lab-scanned over the past 4 or so years, to see if I could get better results and have higher quality scans if I ever wanted to print.

The B/W ones have been fine, but much to my dismay, I’m having real trouble with my colour scans - the lab scans just look so much better.

I’ve been using Noritsu, pre-saturation 3, border-buffer 10%

Sorry, I’m a new user so I couldn’t embed the images - had to use Imgur.

Here you can see this evening beach scene. The lab scan captures the feeling so much better. I tried fiddling around with colours etc but I couldn’t even begin to get anything that matched up.

Here’s another example. I managed to get the colours on my scan to be close to matching the lab scan, but their scan still looks better. The spiderweb is much clearer, and much more the focal point.

I know these kinds of questions get asked a lot, but does anyone have any wisdom to impart? I’d appreciate any and all tips.

Hope you have a lovely day,
Sunday

Can you send some raw files? I can have a go at converting them and send you the settings I used.

To me it looks like NLP is trying too hard to make the image meet the edges of the histogram, so to speak.

Try playing with the WhiteClip and BlackClip settings, and also the LabGlow and LabFade sliders.

WhiteClip and BlackClip dictate where on the histogram of the unconverted raw file the inversion curve should start and end, which in turn determines what lands at the far left and right of the histogram in the converted image.

LabGlow and LabFade can make the darkest black slightly lighter grey, or the brightest white slightly darker grey (difficult sentence to phrase, sorry). The lab scanner has done this - look at the bottom right of the beach image: that “should” be black, but it’s “capped” at dark grey. This gives a soft look.
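If it helps, here is very roughly the idea as a little Python/numpy sketch. To be clear, this is just my mental model of what those sliders do, not NLP’s actual code, and the parameter names and sign conventions here are my own guesses:

```python
import numpy as np

def toy_negative_conversion(neg, white_clip=0.0, black_clip=0.0,
                            lab_fade=0.0, lab_glow=0.0):
    """Toy model of the clip/fade/glow idea -- NOT NLP's real algorithm.

    neg : linear grayscale scan of the negative, values in 0..1
    """
    # Invert so the dense (bright-scene) areas of the negative become bright again.
    pos = 1.0 - neg

    # The clip settings move the points on the histogram that get mapped to
    # pure black and pure white. In this toy version, a slightly negative
    # white_clip pushes the white point past the brightest real data, leaving
    # headroom so e.g. the waves stop clipping to paper white.
    black_point = np.percentile(pos, 0.5) + black_clip
    white_point = np.percentile(pos, 99.5) - white_clip
    pos = np.clip((pos - black_point) / (white_point - black_point), 0.0, 1.0)

    # Fade lifts the deepest blacks toward dark grey; glow pulls the brightest
    # whites down toward light grey -- the soft "lab scan" look.
    return lab_fade + pos * (1.0 - lab_fade - lab_glow)
```

That last line is why the bottom-right of the lab’s beach scan sits at dark grey rather than true black: the output range has been compressed slightly at both ends.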

TLDR (what I would do):

Set the WhiteClip to something slightly negative (should make the whites in the waves look more normal)
Raise the LabFade slider (should fix the crushed look of the shadows)
Adjust the white balance (your scan of the beach image is a bit green for a sunset!)
Lower the contrast a little (are you on LAB - Standard? I would stick with that to start with; it gives me the best look with some tweaking)

Ed

edbr.xyz/


Hi Ed,
Thanks for getting back to me.
I’ll play around with the things you suggested, cheers.
I’ve sent you some RAW files. I wasn’t sure how to send them on here, so I emailed them. I hope that’s ok.
I’m also wondering if maybe the negatives were damaged or something? They’d just been sitting in the camera shop for years.
I ask because I scanned another colour shot, and the results are wildly different from the lab scan.

Hi @sunday,

@edbr gives some great advice here! I would definitely give everything he says a shot.

A few other things to consider:

  1. Yes, negatives can start to deteriorate over time, but if it has just been a few years and they have been stored properly, they should still be fine. And honestly, your conversions look to be in good shape and just need minor tweaks here and there, so I don’t think that is an issue.
  2. Generally, lab scans are done in the context of an entire roll, and that context yields more neutral results from the scanner. If you still have access to the entire roll for each of these shots, you might see results closer to the lab scans if you use NLP’s “Roll Analysis” function. As an example, the green jungle scene is difficult for NLP to convert properly without context: in a single-image analysis, the AutoColor will try to eliminate what it sees as “too much” green in the image, resulting in an image with too much magenta. You can manually fix this by removing magenta with the “tint” slider in NLP, or, if you process it with Roll Analysis in the context of the roll it was originally shot on, it should produce more neutral results (see the sketch just after this list).
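If it’s useful, here is a toy illustration of that second point. The numbers are made up and the “neutralisation” is simple per-channel scaling, not NLP’s actual AutoColor, but it shows why a single green-heavy frame fools a per-image analysis while a roll-wide reference does not:

```python
import numpy as np

def neutralize(frame_rgb, reference_rgb):
    """Toy white balance: scale channels so the reference averages to grey."""
    gains = reference_rgb.mean() / reference_rgb
    return frame_rgb * gains

# Hypothetical average (R, G, B) values for four frames on one roll.
frames = np.array([
    [0.42, 0.61, 0.38],   # jungle scene: legitimately green
    [0.55, 0.50, 0.47],
    [0.48, 0.49, 0.52],
    [0.51, 0.47, 0.45],
])
jungle = frames[0]

# Single-image analysis: the only reference is the green frame itself, so
# "neutralising" it strips all the green and leaves a magenta cast.
print(neutralize(jungle, reference_rgb=jungle))

# Roll analysis: the reference is the whole roll's average, so the jungle
# stays green relative to the rest of the roll.
print(neutralize(jungle, reference_rgb=frames.mean(axis=0)))
```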

Hope that helps!

-Nate

Hi @sunday

Just had a look at your RAWs - here is what I did with the beach one.

(screenshot of the settings used)

I also changed CurvePoints to “Smooth” and ColorMethod to “Highlight Weighted”.

Ignore the “EDBR-D” preset - it just sets whites to 2 and blacks to -2 by default, which is my own preference and was immediately changed for this image anyway.

I think I got closer to the lab scan, though I will say that this was quite a challenging edit. @nate might find these scans interesting as edge cases! One part that stands out to me is the blue in the clouds, though I can’t explain why it is so different in the lab scan.

Here is what NLP is really working with:

(using the NLP-Camera-None profile to make the image linear)

I notice that the data is bunched together quite far to the left. I would advise “exposing to the right” (brighter).
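If you want to sanity-check exposure before converting, something like this should work. It uses the rawpy library; the filename and the 99.9th-percentile cutoff are just placeholders I picked:

```python
import numpy as np
import rawpy

# Rough "expose to the right" check: how many stops of unused headroom sit
# between the brightest real data in the raw scan and the clipping point.
with rawpy.imread("beach_negative.dng") as raw:     # hypothetical filename
    data = raw.raw_image_visible.astype(np.float64)
    black = float(np.mean(raw.black_level_per_channel))
    brightest = np.percentile(data, 99.9) - black
    ceiling = raw.white_level - black
    print(f"unused headroom: {np.log2(ceiling / brightest):.2f} stops")
```

A large number here means the scan could have been exposed brighter without clipping, which would push the data further to the right of the histogram.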

Here is an example of one of my recent scans (same profile).

Finally, here is what I would have done with that scan:


(screenshots)

Ed


Thanks, Ed.
That’s amazing. So helpful.
I appreciate you going out of your way.
I love the photo-taking part, I’ve still got a lot to learn about this part!
Cheers,
Alex
