Exporting NLP metadata so that Mac Photos reads it

I have scanned some 10,000 negatives to which I have added quite a bit of NLP metadata.
The final destination is Mac Photos.
Using NLP export, some fields show up in Photos as part of the caption, but others are lost.
Specifically:

  1. Film format (35mm, etc.) is lost
  2. Location (Sublocation, City, State, Country) is written to EXIF but is not read by Photos
  3. Date is written to EXIF “Date/Time Original”, but I didn’t find which EXIF field it should be copied to so that Photos reads it

Any help would be greatly appreciated.
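For reference, here is one way to inspect which date and location tags the export actually wrote, using exiftool (assumed to be installed; the file name `scan.jpg` is a placeholder):

```shell
# Sketch: dump the date, IPTC location, and GPS tags from an exported file.
# Requires exiftool (e.g. brew install exiftool); scan.jpg is a placeholder.
FILE="scan.jpg"
if command -v exiftool >/dev/null 2>&1 && [ -f "$FILE" ]; then
  # -G1 shows the tag group, -a shows duplicates, -s shows tag names
  exiftool -G1 -a -s \
    -EXIF:DateTimeOriginal \
    -IPTC:Sub-location -IPTC:City -IPTC:Province-State \
    -IPTC:Country-PrimaryLocationName \
    -GPS:all \
    "$FILE"
else
  echo "exiftool or $FILE not available; adjust FILE and install exiftool"
fi
```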

Photos.app is peculiar about metadata.

Normally, not much is displayed; pressing Command-I opens a separate window with more information. Upon import, no metadata should be lost, but it is buried in Photos’ database/catalog, and some of it will come back upon export. So far, so strange.

The best way to find out how to make metadata visible in Photos is … to try.
To test, I set metadata in the source app and see what Photos will show. Here’s an example checking DxO PhotoLab against Photos.app. You can test the same things with LrC/NLP metadata. Note that I added metadata according to the respective field name. This way, I can also see which fields are used, and how they might be localised.
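A related thing worth trying for the location issue: Photos generally maps its Places feature from GPS coordinates rather than from IPTC location text, so writing GPS tags before import may help. A hedged sketch with exiftool; the coordinates, date, and file name below are placeholder values:

```shell
# Sketch: write GPS coordinates (which Photos can map to a place) and the
# capture date before importing. All values and the file name are placeholders.
FILE="scan.jpg"
if command -v exiftool >/dev/null 2>&1 && [ -f "$FILE" ]; then
  exiftool \
    -GPSLatitude=48.1372 -GPSLatitudeRef=N \
    -GPSLongitude=11.5756 -GPSLongitudeRef=E \
    "-EXIF:DateTimeOriginal=1987:06:21 14:30:00" \
    "$FILE"
else
  echo "exiftool or $FILE not available; adjust FILE and install exiftool"
fi
```

After writing the tags, re-import the file into Photos and check whether the date and map location appear as expected.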
