@Joanna
Well, I will tell you a comment from one of the museum's photographers at the place where I worked for seven years. When we went through a new budget, around 2013-14, we found that the biggest cost after wages was storage, specifically the storage of our digitized images.
The old scanning standard was TIFF, and we had old glass plate originals whose scans landed at around 1 GB per image. At that time the town had an agreement with Volvo IT under which storing one single picture like that cost around 100 SEK (about 10 USD) a year. The storage price had been set with Office-type files in mind, not images.
The photographer I refer to then said something like: “Well, then we might not upload so many images, because then we might get sacked!”
Our photographers put their RAW files in a special TEMP folder on their local machines and converted them to DNG with a special action in FotoWare FotoStation (a client to FotoWare's Enterprise DAM). Then they developed the files in Lightroom. The DNG held both the RAW data and the changes the photographer had made to it, and, as a result, also an embedded fully developed JPEG with all the pixels preserved. They also added XMP metadata with FotoWare FotoStation.
**When these files were developed, they were exported to the system's automation hub, which matched the images' image numbers against the corresponding image numbers in the museum's SQL-based “photo metadata database”. If there was a match, the automation hub converted the SQL data to XMP data via the system's XMP schema and pushed it into the DNG files' XMP headers.
Then the DNG files were transferred to the “Master File Archive”. After that, the system extracted the embedded full-size JPEG from the DNG and transferred the XMP metadata to the JPEG before the hub moved it to the “Delivery File Archive”.**
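The matching step above can be sketched in a few lines. This is only an illustration, not FotoWare's actual implementation: the table, column, and XMP field names are invented, and a real hub would write into the DNG's XMP header with a tool such as ExifTool rather than return a dict.

```python
import sqlite3

# Hypothetical schema mapping: SQL column -> XMP property.
# The museum's real XMP schema is not shown in the post, so these
# Dublin Core fields are stand-ins.
XMP_SCHEMA = {
    "title": "dc:title",
    "photographer": "dc:creator",
    "description": "dc:description",
}

def build_xmp_for_image(conn, image_number):
    """Look up one image number; return XMP key/value pairs, or None on no match."""
    row = conn.execute(
        "SELECT title, photographer, description "
        "FROM photometadata WHERE image_number = ?",
        (image_number,),
    ).fetchone()
    if row is None:
        return None  # no match: the hub leaves the DNG untouched
    return {XMP_SCHEMA[col]: value for col, value in zip(XMP_SCHEMA, row)}

# Demo with an in-memory database standing in for the photo metadata database
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE photometadata "
    "(image_number TEXT, title TEXT, photographer TEXT, description TEXT)"
)
conn.execute(
    "INSERT INTO photometadata VALUES "
    "('GM-1234', 'Harbour view', 'A. Photographer', 'Glass plate, ca 1910')"
)
print(build_xmp_for_image(conn, "GM-1234"))
# {'dc:title': 'Harbour view', 'dc:creator': 'A. Photographer', 'dc:description': 'Glass plate, ca 1910'}
```

The point of the lookup-then-map shape is that a missing match is a no-op, exactly as described: only DNGs whose image numbers exist in the database get XMP pushed into them.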
The last and very important step created a second, more lightweight system JPEG, 1280 pixels on the long side, carrying the XMP master metadata, and sent it to the Sketch File Archive. It was this file representation that carried all the XMP master data, and it is considered good practice not to pump heavy, inefficient files around in FotoWeb (not misspelled, just spelled the Norwegian way). Big TIFF files in particular were sometimes painfully slow to process when updating metadata, because TIFF seems to lack the distinctly defined XMP header that DNG and JPEG have.
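The “1280 pixels on the long side” rule is just proportional scaling. A minimal sketch of the size calculation (the function name and rounding choice are mine, not from the system):

```python
def sketch_size(width, height, long_side=1280):
    """Scale so the longer side becomes long_side, preserving aspect ratio."""
    scale = long_side / max(width, height)
    return round(width * scale), round(height * scale)

print(sketch_size(6000, 4000))  # (1280, 853)
print(sketch_size(4000, 6000))  # (853, 1280)
```

A real implementation would hand these dimensions to an image library (Pillow's `Image.resize`, ImageMagick, etc.) when generating the Sketch JPEG.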
What about mismatches in metadata between the files? Well, the metadata in both the Master and Delivery files was only written to the archives once, in the initial process of uploading the Master file. If a web user wanted a copy of a Master or Delivery file, the metadata was copied on the fly during the download from the metadata master, which was always the lightweight Sketch file.
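That copy-on-download design can be sketched like this. All names and data structures here are invented for illustration; the real system obviously stores files on disk and embeds the XMP into the downloaded file rather than returning a Python tuple.

```python
# Heavy Master/Delivery files keep whatever metadata they were written with;
# the lightweight Sketch file is the single metadata master.
ARCHIVES = {
    "GM-1234": {"master": b"<dng bytes>", "delivery": b"<jpeg bytes>"},
}
SKETCH_XMP = {
    "GM-1234": {"dc:title": "Harbour view", "dc:creator": "A. Photographer"},
}

def download(image_number, representation):
    """Serve a file with fresh metadata copied on the fly from the Sketch master."""
    payload = ARCHIVES[image_number][representation]
    # Copy, don't reference: the archive file itself is never rewritten.
    xmp = dict(SKETCH_XMP[image_number])
    return payload, xmp

payload, xmp = download("GM-1234", "delivery")
```

The payoff is what the post describes: curators can keep editing metadata on the small Sketch JPEGs, and every later download of the 1 GB-class Master still comes out with current metadata, without the archives ever being touched again.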
So Joanna, I'm a very big fan of DNG, especially in DAM systems, for these reasons, not least its advantage of being a single standardized RAW format in our chaotic world of proprietary RAW files. There is nothing wrong with DNG. The problem is that the proprietary camera and converter world won't let it shine, because they have their own interests in not doing so.
If it were possible, even in PhotoLab, just to open any DNG without the program refusing it because the camera file it was made from happens not to be supported by DxO yet, then DNG could really begin to work as a universal RAW format in a real sense.