What you need to do is calculate the physical size of the largest print you will make. THEN, base your scanning resolution on that. Your concern should be to achieve the correct number of pixels in both dimensions to fill the needed print real estate at the appropriate PPI for that size print.
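To make that concrete, here's a minimal sketch of the arithmetic (the function name and numbers are mine, just for illustration, not from any scanner driver):

```python
# Sketch: choose the scanner setting from the largest print you plan to make.
# scan PPI = target print PPI x magnification (print size / original size).

def scan_ppi_needed(original_inches, print_inches, print_ppi):
    """PPI to set on the scanner so the original yields enough pixels
    for the target print size at the target print PPI."""
    magnification = print_inches / original_inches
    return print_ppi * magnification

# Example: enlarging a 4x5 inch original to an 8x10 print at 300 PPI
# means scanning at 2x magnification, so 600 PPI.
print(scan_ppi_needed(original_inches=5, print_inches=10, print_ppi=300))  # 600.0
```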
An 8x10 photo print, held 13 to 20 inches from your eyes, can resolve about 240 PPI. A 5x7 print, held about 9 to 13 inches from your eyes, can resolve around 300 PPI. The farther away you are, the less your eyes can resolve. If you need to be two yards away from a big print, it only needs 1920x1080 pixel HDTV resolution to look good. But that same file will make a small photo-quality print about 4.5 by 8 inches at 240 PPI. What changed? The size of the DOTS created FROM the pixels.
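The size math here is just pixels divided by PPI; a quick sketch (the function name is mine, for illustration):

```python
# Sketch: the same pixel file prints at different physical sizes
# depending on the PPI you assign. The pixels themselves never change;
# only the size of the dots made from them does.

def print_size_inches(width_px, height_px, ppi):
    """Physical print size for a given pixel count at a given PPI."""
    return width_px / ppi, height_px / ppi

# An HDTV-resolution file as a small photo-quality print at 240 PPI:
print(print_size_inches(1920, 1080, 240))  # (8.0, 4.5)
# The same file at a coarse 96 PPI fills a much larger area:
print(print_size_inches(1920, 1080, 96))   # (20.0, 11.25)
```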
By PIXELS per inch, I mean original, uninterpolated pixels: those made in the camera, in the scanner, or in post-processing software, from the original scene or print.
Scanners RECORD dots, and scanner drivers create pixels from those dots. The dots have physical size, as in the scanner “seeing” 300 square cells per inch, horizontally and vertically across its bed. That’s what’s meant by ‘dpi’ in the scanner world. Also still in the scanner world, ‘resolution’ means the dpi MULTIPLIED by the size of the image in inches. If you scan an 8x10 print at 1:1 reproduction ratio (100%) AT 300 dpi, you get a file containing 2400 by 3000 pixels.
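That dpi-times-inches arithmetic, as a one-line sketch (function name is mine):

```python
# Sketch: a scanner's dpi setting times the scanned area in inches
# gives the pixel dimensions of the resulting file.

def scanned_pixels(width_in, height_in, dpi):
    """Pixel dimensions produced by scanning a given area at a given dpi."""
    return round(width_in * dpi), round(height_in * dpi)

# An 8x10 print scanned at 1:1 at 300 dpi:
print(scanned_pixels(8, 10, 300))  # (2400, 3000)
```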
BUT PIXELS ARE JUST NUMBERS. A pixel has no physical size or other physical attributes. It is simply a value representing color and brightness. A display or a printer can reproduce the pixels using fixed-size dots evenly spaced (monitor), dots of various sizes (AM screening, as in offset printing), or same-size dots of varying density (FM screening, as in inkjet printing). Displayed or printed dots always have a size, but they are created from numbers (pixels) in a file. Scanned dots always have a set size (dependent on scanner hardware), but they can be used to create files at a specific pixel resolution over a range of magnifications. A 600 dpi 4x5 inch image and a 300 dpi 8x10 inch image BOTH have identical pixel dimensions of 2400 by 3000 pixels. And you can resize them any number of ways later: either interpolate to maintain a fixed resolution, or scale without interpolation to keep the complete set of pixels unaltered.
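You can check that equivalence directly (same illustrative dpi-times-inches sketch as above, redefined here so the example stands alone):

```python
# Sketch: two scans at different dpi and sizes can yield the exact
# same file, because pixels are just numbers with no physical size.

def scanned_pixels(width_in, height_in, dpi):
    return round(width_in * dpi), round(height_in * dpi)

a = scanned_pixels(4, 5, 600)   # 4x5 inch original at 600 dpi
b = scanned_pixels(8, 10, 300)  # 8x10 inch original at 300 dpi
print(a, b, a == b)  # (2400, 3000) (2400, 3000) True
```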
In the scanning world, dots are used in scanner driver software to create pixels in a file.
In the computer world, the displayed image is nearly always interpolated (scaled up or down) to fit the current display magnification. At 100%, one file pixel equals one monitor pixel (which is made of three dots: red, green, and blue).
In the printing world, pixels are used in raster image processing hardware or printer driver software to create the sorts of dots required by the specific reproduction process. So NO, pixels and dots are not the same. The whole system is somewhat analogous to digital audio: A microphone turns sound into voltage, which is digitized into numbers for storage and manipulation. To hear the sound, the numbers are turned back into analog voltage and amplified so a loudspeaker can pump air to create sound. Digital audio is JUST A BUNCH OF NUMBERS (a whole lot of them!).