What resolution to scan?

Just curious what resolution people use in Silverfast to scan negatives.

Do you just accept the default "photo quality, 300 ppi" setting, or do you set the resolution slider yourself?

With a Nikon V I set the resolution to 4000 in both SilverFast and Nikon Scan.
With an Epson V850 I set the resolution in SilverFast to 2400 ppi, with the slider at the default "photo quality, 300 ppi".

I am a little bit confused about ppi and dpi. Are they not the same thing, the number of pixels/dots per inch?

When I scan with Nikon Scan or Epson Scan, the resolution I set is exactly what Photoshop reports at the bottom left.
However, when I scan with SilverFast, Photoshop always reports 300 dpi at the bottom left, no matter whether I set the resolution to 2400 or 4000 in SilverFast. Is the resolution information in the file simply incorrect?
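One way to see that the "300 dpi" reported by Photoshop is just a metadata tag: the pixel count of a scan is fixed by the scanning resolution and the film size, and the ppi tag only sets a default print size. A rough sketch in Python (the 36 x 24 mm frame size is the standard full-frame 35mm negative; the numbers are illustrative):

```python
# A 35mm negative frame is about 36 x 24 mm (~1.417 x 0.945 inches).
frame_w_in = 36 / 25.4

# Scanning at 4000 dpi fixes the pixel count:
px_w = round(4000 * frame_w_in)
print(px_w)  # 5669 pixels across, regardless of the ppi tag

# The ppi tag in the TIFF only sets a default print size;
# a file tagged "300 ppi" would print this many inches wide by default:
print(px_w / 300)  # about 18.9
```

So a scan tagged 300 ppi and the same scan tagged 4000 ppi contain identical pixels; only the suggested print size differs.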

I am scanning color negatives as 48-bit TIFFs with iSRD and black-and-white negatives as 16-bit TIFFs with SRDx.

I want to get the best quality, so I wonder. I am digitizing the negatives and will only show them on a PC/TV, not print them, but I would still like prints to look good if someone ever wants one.

Thanks for your help.

They are similar and different.

  • ppi means pixels per inch in an image; each pixel contains RGB colour information.
  • dpi means dots per inch and is used in scanning and printing. In scanning, dots are combined to form a pixel; in printing, a pixel is put on paper not by RGB ink droplets but by several droplets of cyan, magenta, yellow and black ink (with a basic 4-colour printer). Scanners use sensors that are sensitive to different colours under white light, or sensors that measure the brightness of all colours under different lights (e.g. red, green and blue).
  • scanning software often states dpi for the target/output file, while the scanning resolution is called low, medium or high, meaning the scanner selects appropriate dpi values depending on your choice.

Many scanners’ tech specs quote very high dpi values, and the quality of the scans sometimes improves with higher values, but not necessarily all the way up. While a 4800 dpi scan might provide more output pixels, scanning at lower values can produce files that are technically just as good when prints of equal size are viewed at reasonable viewing distances. So test different scanning resolutions (dpi) and see at which values the files look best relative to a) the drive space you are willing to dedicate to an output file or b) your maximum print size.
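To put the drive-space trade-off in numbers: a rough estimate for an uncompressed TIFF is pixel count times bytes per pixel. A sketch in Python (the 35mm frame size and the 48-bit/3-channel assumption are illustrative, not exact for any particular scanner):

```python
def tiff_size_mb(dpi, width_in, height_in, bits_per_channel=16, channels=3):
    """Rough uncompressed TIFF size in MB for a 1:1 scan at a given dpi."""
    px = round(dpi * width_in) * round(dpi * height_in)
    bytes_total = px * channels * bits_per_channel // 8
    return bytes_total / 1e6

# A full 35mm frame (36 x 24 mm) as a 48-bit colour TIFF:
print(round(tiff_size_mb(4000, 36 / 25.4, 24 / 25.4)))  # ~129 MB
print(round(tiff_size_mb(2400, 36 / 25.4, 24 / 25.4)))  # ~46 MB
```

Going from 2400 to 4000 dpi nearly triples the file size, so it is worth checking whether the extra pixels actually carry more detail from your film.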

What you need to do is calculate the physical size of the largest print you will make. THEN, base your scanning resolution on that. Your concern should be to achieve the correct number of pixels in both dimensions to fill the needed print real estate at the appropriate PPI for that size print.

An 8x10 photo print, held 13 to 20 inches from your eyes, can resolve about 240 PPI. A 5x7 print, held about 9 to 13 inches away, can resolve around 300 PPI. The farther away you are, the less your eyes can resolve. If you view a big print from two yards away, it only needs 1920x1080 (HDTV) pixel resolution to look good. But that same file will make a photo-quality print of only about 4.5 by 8 inches at 240 PPI. What changed? The size of the DOTS created FROM the pixels.
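The arithmetic here is simply pixels = print size in inches times PPI, in each dimension. A small sketch using the viewing-distance figures from the paragraph above:

```python
def pixels_needed(width_in, height_in, ppi):
    """Pixel dimensions needed to fill a print at a given PPI."""
    return round(width_in * ppi), round(height_in * ppi)

print(pixels_needed(10, 8, 240))  # (2400, 1920) for an 8x10 viewed at arm's length
print(pixels_needed(7, 5, 300))   # (2100, 1500) for a 5x7 held closer

# And in reverse: the 1920x1080 HDTV file printed at 240 PPI:
print(1920 / 240, 1080 / 240)     # 8.0 4.5 -> about 4.5 x 8 inches
```

Run the first function for your largest intended print, then pick a scanning resolution that yields at least that many pixels from the negative.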

By PIXELS per inch, I mean original, uninterpolated pixels, made in the camera, or made in the scanner, or made in post processing software, from the original scene or print.

Scanners RECORD dots and scanner drivers create pixels from those dots. The dots have physical size, as in the scanner “seeing” 300 square cells per inch, horizontally and vertically across its bed. That’s what’s meant by ‘dpi’ in the scanner world. Also still in the scanner world, ‘resolution’ means the dpi MULTIPLIED by the size of the image in inches. If you scan an 8x10 print at 1:1 reproduction ratio, or 100%, AT 300 dpi, you get a file containing 2400 by 3000 pixels.

BUT PIXELS ARE JUST NUMBERS. A pixel has no physical size or other physical attributes. It is simply a value representing color and brightness. A display or a printer can reproduce the pixels using fixed size dots evenly spaced (monitor) or dots of various sizes (AM screening as in offset printing), or same-size dots of various densities (FM screening as in inkjet printing). Displayed or printed dots always have a size, but they are created from numbers (pixels) in a file. Scanned dots always have a set size (dependent on scanner hardware), but they can be used to create files at a specific pixel resolution over a range of magnifications. A 600dpi 4x5 inch image and a 300dpi 8x10 inch image BOTH have identical pixel dimensions of 2400 by 3000 pixels… And you can resize them any number of ways, later, either interpolating to maintain a fixed resolution, or scaling without interpolation to maintain the complete set of pixels as unaltered.
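The point that dots have size while pixels do not can be checked with the same arithmetic: dpi times inches gives the pixel dimensions, and two files at different dpi/size combinations can be pixel-for-pixel identical. A sketch:

```python
def pixel_dims(dpi, width_in, height_in):
    """Pixel dimensions of a 1:1 scan: dots per inch times inches."""
    return round(dpi * width_in), round(dpi * height_in)

# 600 dpi at 4x5 inches and 300 dpi at 8x10 inches: the same file.
assert pixel_dims(600, 4, 5) == pixel_dims(300, 8, 10) == (2400, 3000)

# "Resizing without interpolation" just re-tags the ppi; the print
# size changes but the 2400x3000 pixels are untouched:
for ppi in (600, 300, 240):
    print(ppi, 2400 / ppi, 3000 / ppi)  # ppi, print width, print height (inches)
```

The loop shows the same set of numbers being reproduced at three different physical sizes, which is exactly the pixels-vs-dots distinction.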

In the scanning world, dots are used in scanner driver software to create pixels in a file.

In the computer world, the displayed image is nearly always interpolated (scaled up or down) to fit the magnification on display. At 100%, one file pixel equals one monitor pixel (which is made of three dots, red, green, and blue).

In the printing world, pixels are used in raster image processing hardware or printer driver software to create the sorts of dots required by the specific reproduction process. So NO, pixels and dots are not the same. The whole system is somewhat analogous to digital audio: A microphone turns sound into voltage, which is digitized into numbers for storage and manipulation. To hear the sound, the numbers are turned back into analog voltage and amplified so a loudspeaker can pump air to create sound. Digital audio is JUST A BUNCH OF NUMBERS (a whole lot of them!).

Thank you for the in-depth information.