Unwanted Virtual Copies and Moving DxPL edited images (DOPs) between Systems - Revisited

Please make sure to be precise (not more detailed)

DPL can be set/used to act on .dop sidecars in several ways

  • auto import
  • auto export
  • manual import
  • manual export
  • combinations thereof

DPL can be set/used to act on XMP (data in files or .xmp sidecars)

  • auto sync (not separate for I/O)
  • manual import
  • manual export
  • combinations thereof

I suppose (have not tested it) that XMP is not involved in the creation of virtual copies, which means that only the image files and/or the respective .dop sidecars cause VCs to appear…which nevertheless creates quite a bunch of scenarios that require testing.

Creation of VCs = f (DB, image file, .dop sidecar, app settings, manual actions…)
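To illustrate just how quickly those scenarios multiply, here is a minimal Python sketch that simply enumerates the combinations; the factor names and values are illustrative labels only, not actual PhotoLab setting names.

```python
# Minimal sketch (not DxO code): count the test combinations implied above.
# Factor values are illustrative labels, not PhotoLab's actual setting names.
from itertools import product

dop_handling = ["auto import", "auto export", "manual import", "manual export"]
xmp_handling = ["auto sync", "manual import", "manual export"]
image_state  = ["image only", "image + .dop", "image + .xmp", "image + .dop + .xmp"]
db_state     = ["image unknown to DB", "image already in DB"]

scenarios = list(product(dop_handling, xmp_handling, image_state, db_state))
print(len(scenarios), "scenarios to test")   # 4 * 3 * 4 * 2 = 96
for s in scenarios[:3]:                      # show a few examples
    print(s)
```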

Good luck!


@platypus I wrote the post as I conducted the tests, with no preconceived notion about what I might “discover”; hence the detail.

I could (and ultimately will) rewrite the outcome of any tests that I conduct.

Because I use the defaults, I look forward to your tests with the other combinations, which will give us an insight both into what those combinations might help or hinder and into what happens on the Mac platform!

I look forward to your results, my settings are

and the DOP sidecar settings will remain as they are for the remainder of my tests. They will have a material impact on the way things work, because some of my tests produced metadata from the DOP, but only after an ‘Import’.

Hence, if you run with the ‘load settings’ option OFF (unchecked) you won’t see what happens automatically, but you will ultimately get the same results as I did, with the automatic DOP load not refreshing the metadata and the ‘Import’ doing that job, because with the option unchecked the only way of getting new DOP data is via the ‘Import’.

You are welcome to check all the combinations but it should be easy to extrapolate them from my results with the options at their default settings.

The XMP sidecar and embedded (RAW and JPG) metadata is a different situation; I will test a number of scenarios but not every combination, and you are welcome to fill in the additional detail for what I don’t test.

THE ONLY THING THAT GOVERNS VCs IS THE Uuid, BUT the state of the image itself, i.e. not just the DOP, might cause a different reaction.

EXCEPT that what seals the fate of the Uuid is the first import into the second system (System B), and since all the data is new to System B, what state of that data could affect the import!?

The answer to that is where that data resides and where the database resides, and those are yet more tests:

  1. Data moved between systems on a USB drive (Flash or SSD or HDD or …)
  2. Two systems attempting to access one set of data.
  3. Two systems attempting to access one set of data with just one database.

Please add other scenarios to the above list and I will then be looking for volunteers!

Good of you to enumerate the various combinations; now how about helping by testing some!

Update:-

If you ever want to synchronise your systems, DO NOT TURN OFF the automatic DOP load. If you do, you fall into the trap discussed earlier: the image is discovered on B, but no edits or metadata are imported, and a Uuid is allocated by System B.

Now the fate of the image is sealed and when you ‘import’ from the DOP you are going to get a guaranteed VC without really trying too hard.

So, another rule: for synchronisation to work the ‘Load from DOP’ option must be active, but that issue (discovery of an image must be with a DOP) was discussed at the beginning of my post, courtesy of a reminder from @Aearenda!
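As a way of pinning that rule down, here is a toy Python model of the behaviour described above. It is emphatically not DxO’s actual logic, just the observed rule written out as code: discovery without a DOP read allocates a fresh Uuid, and a later DOP ‘Import’ whose Uuid does not match the one in the database produces a VC.

```python
# Toy model of the VC behaviour described above - NOT DxO's implementation,
# only the rule observed in these tests.
import uuid

db = {}  # image path -> Uuid known to this system's database

def discover(image, dop_uuid=None, auto_dop_load=True):
    """First discovery of an image on a system (e.g. System B)."""
    if auto_dop_load and dop_uuid is not None:
        db[image] = dop_uuid          # adopts the Uuid carried in the DOP
    else:
        db[image] = uuid.uuid4()      # no DOP read: the system allocates its own Uuid

def import_dop(image, dop_uuid):
    """Later manual 'Import' of the DOP sidecar."""
    if db[image] != dop_uuid:
        return "Virtual Copy created"
    return "edits applied to the existing entry"

# The trap: auto DOP load OFF at discovery, then a manual DOP import.
system_a_uuid = uuid.uuid4()
discover("IMG_0001.RW2", dop_uuid=system_a_uuid, auto_dop_load=False)
print(import_dop("IMG_0001.RW2", system_a_uuid))   # -> Virtual Copy created
```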

Well, I’m not going to test all the variations because I don’t work on more than one computer, something that is not meant to happen anyway. Maybe DxO will add some functionality that allows the sync of databases and/or files across computers for a seamless user experience in a future release…

Congrats to all of you having the will and stamina to try nonetheless.

@platypus What the tests show is that arguably DxO could claim the functionality is there already!

@JoJu won’t like it, because I have made some tentative additional tests and they show certain issues once metadata is taken into account (currently held in the sidecar, in my case), so there will be caveats. But if you have a laptop and a main machine, then:

  1. With two copies of exactly the same version of DxO.
  2. With AS(OFF) but the DOP options left to default.
  3. With two separate databases and two separate file systems (currently mine are identical for the directory under test but I don’t believe that is even vaguely important)
  4. Synchronisation is not only possible, it is also possible to pass edits backwards and forwards with only a little care in real time!
  5. But that might start to come unglued if the metadata is carried around using the correct metadata handling instead of DOPs; I am not yet sure of the exact boundary conditions where the well-behaved system that I was happily using earlier becomes more complicated (or not).

I have no problems updating XMPs via my DAM if I make changes through that, e.g. new keywords as done today. The data was copied to the laptop from the desk PC and I later found out some fungus identities. Added to PL OK, and it updated the information it held correctly and displayed the updated keywords.

@DxO_Support-Team Bug Report - DxPL ‘Removes’ “wrong images”

After an appalling set of tests, where all the problems were started by my own silly mistakes, I will write up the issues tomorrow. In the meantime, the following bug is present in PL5.5.0, PL6.0 and PL6.0.1, which has just finished downloading and installing!

While conducting tests I had to keep clearing data because of my errors and the occasional DxPL “glitch”, so I stopped deleting images and started renaming the directory instead. Here is the error I encountered.

  1. My test directory contains 4 images (2 JPG and 2 RAW), 4 DOPs and 2 XMP sidecar files.

  2. I copy them to a test location and navigate to that in DxPL, with AS(OFF) but DOP load etc. set as default (ON).

  3. Change the name of the directory to “{name}-Renamed”

  4. Now copy the original set of images, DOPs and sidecar files back to the original directory name, so that there are two directories containing the same sets of files: “{name}-Renamed” and the newly reconstructed “{name}”.

  5. Select the 4 images in the “{name}” directory and ‘Remove’ in DxPL

  6. DxPL deletes the images from “{name}-Renamed”: it shows “{name}” as being empty, and “{name}-Renamed” also shows as empty and is indeed empty!

  7. Only a Restart will show the true state of “{name}”, i.e. images intact but the “{name}-Renamed” images have all been deleted from the disk!!

  8. DxPL is getting its wires (database references) crossed or something like that!!!
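For anyone who wants to retry this, here is a small Python helper that lays out the duplicate-folder situation. The paths and the RAW extension are assumptions, and the rename (step 3) and ‘Remove’ (step 5) were done inside DxPL in my test, so those steps still have to be done in the application.

```python
# Helper for the repro above (hypothetical paths). It covers the copy steps
# (2 and 4); the rename (step 3) and the 'Remove' (step 5) were done inside
# DxPL, so do those in the application.
import shutil
from pathlib import Path

SOURCE = Path(r"D:\TestSets\Original")        # 4 images, 4 DOPs, 2 XMPs (assumed path)
TARGET = Path(r"D:\TestSets\PLTest\name")     # the "{name}" directory (assumed path)

def copy_set(src: Path, dst: Path) -> None:
    """Copy images, DOPs and XMP sidecars into dst, creating it if needed."""
    dst.mkdir(parents=True, exist_ok=True)
    for f in src.iterdir():
        if f.suffix.lower() in {".jpg", ".rw2", ".dop", ".xmp"}:   # adjust extensions
            shutil.copy2(f, dst / f.name)

copy_set(SOURCE, TARGET)      # step 2: first copy, then browse to TARGET in DxPL
# ... rename TARGET to "{name}-Renamed" inside DxPL (step 3) ...
copy_set(SOURCE, TARGET)      # step 4: recreate "{name}" alongside "{name}-Renamed"
```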

Rename the folder in PhotoLab’s sidebar or in Windows Explorer?

PS: Make your finding a separate bug report; it might otherwise be lost…

@platypus Sorry, if I had used ‘Rename folder’ it would have been clearer than just “Change the name…”.

It was done in DxPL to minimise the “risks” of any time lags that might have occurred when DxPL discovered a directory had vanished and needed to clean up the database as a consequence!

Instead, it ties itself in knots!

It just rounded out a set of tests that I should have left until the following day, because DxPL does not handle the removal of directories (a consequence of my own mistakes) particularly well, as you will see when I have composed the posts for two other issues that occurred along the way!?

Please make each issue report a separate thread. They will be easier to track for DxO and the forumers.

BTW: DPL (Mac) cannot rename folders.

Bryan, I must say that I have enormous difficulty following any of your “tests”. They are just way too complex since they seem to deal with more than one problem at a time - far too much to hold in the brain.

You ask about rules for avoiding problems. There is one simple rule - don’t pass and repass images and DOPs between machines unless you…

  1. enjoy pain
  2. delete the database to avoid the pain

@Joanna I am sorry if you find my tests and explanations too long and tortuous to follow; I must work on that. The truth is that I find the same with some of yours and others’, where the writer knows and understands the problem but the reader can’t necessarily get up to speed quickly enough! But cryptic posts don’t work either, so …

But your solution is, I believe,

  1. Largely unnecessary

  2. Way too destructive, particularly to achieve harmony between a field laptop and a desktop system.

  3. My alternative procedure that I espoused some time ago should work, although the deletion bug I have just found might just “interfere” with that.

  4. The tests above worked perfectly and so I moved into testing with metadata in tow (in the containers that should be used), and attempted the tests after an afternoon chopping down vegetation and clearing up the debris with only a head torch for light! I made too many mistakes setting up the tests because I was tired, and found some old bugs I have encountered before and the new “sleight of hand” deletion bug.

  5. None of this is really hard; the hardest part is actually understanding what is possible when we are working in a vacuum of non-communication from DxO.

I believed the fallacy about how hard hierarchical keywords are and how they should be avoided at all costs! Then I realised it was simply a formatting issue and every package has adopted its own variation in rules! If you implemented my Keyword Formatting Template design your program could be everything to everyone on the Mac, except a purist like you!?

But that is actually just another entry in the table, so your package can interwork with all the other packages!?

The problem with your package, and with the Python KFT script I am (slowly) developing when not gardening, DIYing, testing and writing huge posts, is that they are yet another component, when the technology could so easily be built into DxPL, @Musashi!?
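To show how small the “KFT” idea really is, here is a minimal Python sketch of the delimiter-translation part; the delimiters listed are illustrative assumptions, not a verified table of what each package actually writes.

```python
# Minimal sketch of the Keyword Formatting Template idea: hierarchical keywords
# are the same data everywhere, only the delimiter/format differs per package.
# The delimiters below are illustrative assumptions.
DELIMITERS = {
    "pipe":  "|",    # e.g. "Nature|Fungi|Amanita"
    "gt":    ">",    # e.g. "Nature > Fungi > Amanita"
    "slash": "/",    # e.g. "Nature/Fungi/Amanita"
}

def to_parts(keyword: str, style: str) -> list[str]:
    """Split a hierarchical keyword into its levels for the given style."""
    return [p.strip() for p in keyword.split(DELIMITERS[style])]

def from_parts(parts: list[str], style: str, spaced: bool = False) -> str:
    """Rebuild the keyword in another package's style."""
    sep = DELIMITERS[style]
    return (f" {sep} " if spaced else sep).join(parts)

parts = to_parts("Nature|Fungi|Amanita", "pipe")
print(from_parts(parts, "gt", spaced=True))   # Nature > Fungi > Amanita
```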

The hardest part about the tests that I am currently doing is the time it takes to set them up and capture results when the tests are over in a couple of minutes!

So, a cryptic version of the long post:

  1. According to my tests it is possible to synchronise two copies of the database, plus images and DOPs, between two systems running identical versions of DxPL when the metadata is passed via the DOP. So well, in fact, that DxPL can be active on both systems at the same time and the edit data will happily flip-flop between the systems automatically!

What I was starting to test when I made errors and then found errors was to repeat that success with “conventionally held” metadata in tow!

My rule is: Never let PL ‘see’ an image without a DOP sidecar, apart from when first importing.

This process works for me, and involves no pain and no database deletion (but I don’t manage ratings, keywords or colour labels in PL):

  1. In preferences, set PL to automatically import and export sidecars.
  2. Introduce new images to PL on one computer, and make sure it generates DOPs for all the images (a default preset other than ‘No corrections’ will do that).
  3. With PL closed on both computers, copy the images and associated DOPs (and any XMPs) from the first to the second.
  4. Open PL on the second computer and move to the folder with the new images; it will ingest them and use the same ID as it finds in the DOP files.
  5. Thereafter, the DOPs can be copied back and forth after each editing session, but never let PL be running on both computers at once.
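A quick way to enforce that rule before the copy in step 3 is to check the folder for images without a matching .dop first. A minimal Python sketch follows, in which the folder path and image extensions are assumptions to adjust for your own setup.

```python
# Pre-copy check in the spirit of the rule above: list any image in the folder
# that has no matching .dop sidecar. Folder path and extensions are assumptions.
from pathlib import Path

FOLDER = Path("/Volumes/Photos/2022-10-Shoot")           # assumed folder to be copied
IMAGE_EXTS = {".jpg", ".jpeg", ".rw2", ".nef", ".cr2"}    # adjust to your cameras

# PhotoLab names sidecars "<image name>.<ext>.dop", e.g. IMG_0001.NEF.dop
missing = [f.name for f in FOLDER.iterdir()
           if f.suffix.lower() in IMAGE_EXTS
           and not f.with_name(f.name + ".dop").exists()]

if missing:
    print("Do not copy yet - images without DOPs:", missing)
else:
    print("Every image has a DOP; safe to copy the folder.")
```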

In practice, as I mentioned earlier, I use Resilio Sync to keep the folders in step between the computers, and this works even when PL is open on one of the computers. This all works reliably for me on Macs; other sync tools on other computers may work as well - YMMV.


I agree; using Windows and Syncovery with 2 machines, I have never had a problem working as Aearenda describes. Indeed it’s 3-way: using a DAM, XMPs can be updated and the resulting changes imported into PL on both PC and laptop. It can get complicated, as weeding in PL needs my DAM to be updated on both machines and PL updated itself after copying the changes between them. But as PL is hopeless for lack of database management, I regularly delete the database to clear out any problems that might be created in it. The DAM is no problem; it is run to verify changes and this keeps its database and thumbnails updated. But I have never had any problems doing this.

My testing indicates the same!

A must! If you turn off the DOP import then transfer without VCs will be impossible

I believe that a forced DOP Export on all images selected within a directory will ensure that a DOP is created regardless of whether the image has been accessed or not. Applying a ‘No Correction’ preset also seems to have created a DOP which was written in the 20 second DOP cycle!

Confirmed by my testing, including for images with embedded and sidecar XMP data! With AS(OFF) the metadata will be taken from the DOP at first discovery and can then be updated from the image with a ‘Read from image’ if deemed necessary or expedient.

But so far my tests have indicated that even leaving the same directory (not literally the same, but the clones) open and selected is perfectly O.K., and having the directory active at the time that the copy is made from one system to the other is fine!

The edit data, including the Tag, is discovered and displayed immediately, but the metadata from the DOP needs to be updated with an ‘Import’, and from the image with a ‘Read from image’!

Beyond Compare is my software of choice and is available on Windows and Mac and perhaps a little cheaper. The main reason that I use it is that I don’t want automatic backup but I do want to know what differences there may be and to compare the contents of the files in detail!

But each to their own favourites!

You may be right, but please be specific. If the features that you are referring to are the ones that @platypus and I have requested, that is understandable, particularly as I understand from @platypus’s comments

that the rename capability is not available on the Mac.

The DxPL database is not particularly weak, and it is SQLite like most of the other products; ACDSee uses a dBase derivative, Photo Mechanic uses none (except with the Plus version), FRV uses none.
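For anyone who wants to judge that for themselves, the database is a plain SQLite file that can be opened read-only with standard tools. A minimal Python sketch follows; work on a copy, never the live file, and the path shown is an assumption.

```python
# Read-only look inside a COPY of the PhotoLab database. The path is an
# assumption; no schema knowledge is assumed beyond SQLite's own catalogue.
import sqlite3
from pathlib import Path

DB_COPY = Path(r"D:\Temp\PhotoLab_copy.db")   # copy of the database, path assumed

con = sqlite3.connect(f"file:{DB_COPY.as_posix()}?mode=ro", uri=True)
tables = [r[0] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(len(tables), "tables:", tables[:10])
con.close()
```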

Please be specific: what problems have you personally experienced?

I call DxO out time and time again for issues I have found and consider should be fixed, and could be fixed with very little development, but there is way too much generalised criticism in the forums.

The reason for this topic was to see if I could get to the bottom of what was true and untrue about using DxPL to maintain “paired” system processing. The original topic, which went by the title of “Unwanted Virtual Copies”, ran for a year, plus there were others that discussed nightmare issues when trying to synchronise between machines. Why do these problems appear to exist; what are the facts and what is the fiction?

Thanks to what @Aearenda has written and what I have discovered in my testing (thus far), databases don’t need to be destroyed (but can be if that suits the user) and synchronisation can be successfully maintained between two systems!

However, when I went “off track” with my testing I (re-)discovered issues that I need to re-test and report, which may indicate that the database clean-up routines in DxPL can be “fooled”, but I was reloading the same images again and again with the same DOPs and …


I remember the time, in early PL and OpticsPro, when the standard support response to almost every problem was to delete the database. Indeed, many times that was the problem, though not always!

I haven’t had problems, as I delete it; I don’t need it. BUT when copying between two PCs, and usually this includes copying back to the laptop, a largish proportion of the original images will have been deleted after dealing with them on good screens. On the laptop as well I only keep the last two years’ images; to keep Photo Supreme updated they are deleted via it (I don’t think there is any way of doing so via PL). Thus the combination of regular deletion of images and annual year deletion means the PL database would be filled with no-longer-existing information, which I don’t need anyway as I use Photo Supreme for keywords and projects. Photo Supreme has a range of tools to maintain the database and its contents. I can verify folders, years or even the whole collection, as I have done a number of times when retrospectively adding keywords. I can scan my drive for images that are missing files or folders. Unlike PL, I can compact the database, which also verifies it, and can export with verification.

Thus on the laptop there is a constant flux of images that PL isn’t designed to deal with. On the desktop I don’t use anything in the database and, as said right at the beginning, I go back to the days when the database was the first port of call for problems with PL and earlier.

Indeed, that often solved the issues. Recent versions of DPL have been more stable, though, but the DB and photo archive tend to diverge unless we never move, rename or delete files outside of DPL.

I have tried to keep a healthy database for many years, but it is hard to do while testing, and in view of the fact that I cannot tell DPL to create and use a different database. I’ve therefore switched to deleting the DB regularly, sticking to Lightroom for its robust management functionalities and to DPL for its optical and noise corrections.


Photo Supreme allows multiple databases, and you can have material in it that’s not directly accessible. It’s not something I have used; I clear out the former years’ images to keep the storage down on my laptop, which is getting on a bit now and doesn’t have the size of storage more widely used these days. But switching databases is an easily used option, as I expect most proper DAMs have…

@platypus I know you use a Mac, so what works on a Win10 PC may not work on a Mac, but although the commands are not directly accessible the following is possible (on a Win10 PC at least) and useful for testing!

  1. Navigate to an agnostic directory, possibly one added for the purpose, i.e. one without actual images in that directory
  2. ‘Create a backup’ of the current database in DxPL and close DxPL
  3. Delete or change the name of the active database!
  4. Re-open DxPL and immediately ‘Create a backup’ e.g. “PL6.0.1 - Blank DB for Testing”
  5. Restore the previous database which requires a restart.
  6. Whenever you want to test, ‘Create a Backup’ of the current DB, be that a production copy or another test copy, and Restore either the blank (ready for a new test) or a previous backup of an in-progress test or …
  7. BUT nowhere in DxPL (I believe) is there any clue to the origins of the database, and any browsing of directories will quickly just add those items to the database! I actually hate the import process of other products, but it means that it is harder to add more and more to the database simply by the act of browsing!?
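If you prefer to script step 3 rather than rename the file by hand, a minimal Python sketch follows. Run it only with DxPL closed, and note that the database location and file name are assumptions based on a default Win10 PL6 install (there may also be companion -wal/-shm files to deal with), so check your own system first.

```python
# Step 3 above ("delete or change the name of the active database") scripted.
# Run ONLY with PhotoLab closed. Path and file name are assumptions for a
# default Win10 PL6 install; companion files (-wal/-shm) may also exist.
import os
from datetime import datetime
from pathlib import Path

DB_DIR  = Path(os.environ["LOCALAPPDATA"]) / "DxO" / "DxO PhotoLab 6" / "Database"  # assumed
DB_FILE = DB_DIR / "PhotoLab.db"                                                     # assumed name

def set_aside_active_db() -> Path:
    """Rename the active database so DxPL creates a fresh one on next start."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    parked = DB_FILE.with_name(f"PhotoLab-{stamp}.db.parked")
    DB_FILE.rename(parked)
    return parked

if __name__ == "__main__":
    print("Parked as:", set_aside_active_db())
```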

@John7 Photo Supreme, which you use, and IMatch, which I own, are full-blown image DAMs; that is their principal or sole reason for being!

DxPL is a RAW processor and editor with a workable DAM infrastructure, which is essentially an extension of what was put in place to store editing data (with the DOPs as a portable backup which can “live” with the image wherever it goes!?)!

There is a danger of condemning one for not being on a par with the other, but DxPL does require some additional features as various forum members have posted from time to time. The most infuriating aspect is that many of these features are already available in free software and would take very little effort to add to DxPL.

They are not new headline-grabbing features that will drive sales, but they would, inch by inch, sorry, centimetre by centimetre (I typically use both when measuring for cutting wood etc.), improve the overall usability of the product for existing and future users.

@BHAYT, my point is that I want to use DPL as is, without workarounds. Today, PhotoLab on Mac provides no means to use several databases. Paths are buried in preferences files and no provisions exist to access the respective entries. That is okay for me as long as I use DPL as a side story to Lightroom.

DxO could easily produce a “DxO PhotoLab Pro” package with decent DAM services and DB maintenance plus all the editing goodies of today’s DPL. Some paradigms would need to change though, one of them being automatic import, which is the first weak spot of the current approach imo.


Do you mean import of sidecar/.dop files? … If so then I say “au contraire” … because, for me, this is a key feature of PL … in fact, the very one that led me to choose OpticsPro (as it was named pre-PhotoLab) over alternatives that depended on a “black-box” database.

John M