PL5 keywords, one step closer, many steps still to go

Hi uncoy - Thanks very much for your very thorough explanation of how you arrived at PM and why you are so pleased with it. It’s unfortunate that IMatch is Windows-only, so you did not have the opportunity to test it.

I know that there are experienced IMatch users here. It might be helpful if some of them could post about IMatch. At this point, all I seem to hear about in this forum is PM, and it would be beneficial for others who are considering purchasing a DAM to hear something about IMatch from PL users, pro or con.

Also uncoy thank you for your last sentence in the post. :slight_smile:

1 Like

I used iMatch about 20 years ago, until I switched to Lightroom 1. iMatch was, and still is, very powerful with tons of functionality. What I really liked about it was its built-in scripting facility, which I used quite a lot.

When I ditched LR in favour of PL I needed a way of copying photos from my camera cards to my hard drive, similar to what I did with LR. iMatch could not do this, so I found PM, which was very good at managing photos in a very flexible way. I then noticed that PM+ had a beta program, so I got involved and have used it ever since. I am very happy with PM+ and PL working well together.

I narrowed my search down to Photo Supreme and Photo Mechanic Plus.
The criteria being the ability to compare two images and easy keyword application.
And, of course, speed.
There was something about PS that made simple keywording complicated, so I chose PM Plus.
PM Plus may seem complicated at first to the occasional user, but you can decide what you want out of it and ignore the other features.

I agree. Also because of the basic image adjustments such as white balance, exposure etc.
What I find a bit awkward about the FRV image compare setup is the method it provides to paste adjustments from one image to another.

BTW it would be nice if PM Plus recognized .afphoto files.

Joakim, now you are talking! Aperture has been history for four years, and still some discuss it as an alternative to modern XMP-compliant tools. At least that is how I received the presentation of Aperture you gave us earlier. My only interest in this is trying to make people reflect on the fact that tools that don´t adapt to open, well-established standards like XMP or IPTC, and instead lock you into proprietary prisons, are potentially disastrous for those caught up in them too long. The risk is that all of your work gets wasted, since all migration paths may have been closed long ago, or never existed at all.

Don´t you think people have a right to know about that? Why do you want a discussion like that to “shut up”?

1 Like

How could I have known that you, the former seller of spy-ware (if I remember correctly?), are so unaware of a DAM which actually made a lot of things easier than your patchwork of software, costing 3 × more than Aperture used to and being 3 × less easy to use? Ok, not easier for you; on the Windows side the OS can be a limitation. Since when is Windows able to display RAW files? Not that long. Since when can you search in Windows 10 for items on your hard drive with a simple, basic full-text search? Both abilities of the OS made Aperture easier to develop. When the whole OS, be it Finder, Mail or web browser, can benefit from smart folders, no one needs keywords to find files.

No one (except in your imagination) is discussing a dead app’s development as an alternative to modern (that one gave me a good laugh) XMP-compliant tools.

Aperture is just the yardstick by which I judge other wannabe-DAMs in terms of ease of use. And DxO + PM, plus or minus, is not winning the golden pot in this respect.

Hello Folks, friends, forum members, staff and anybody else

I have the feeling that more and more personal hostility, buzzwords and non-factual language are creeping into the forum.
Can’t we just get back to factual discussion, exchange of interests, and linguistic fun without becoming irrelevant?

With all the different approaches, opinions and areas of interest, we can still exchange views in a civilized way and let some harmony prevail.

Wishing you all a nice pre-Christmas time

Guenter

8 Likes

Fully agree

1 Like

I have never sold any spy-ware, by the way. XML Spy is a tool developers use to build XML-based systems for “business to business” communication, or data exchange in loosely coupled systems on the Internet. You can see the kind of tools in this space with the link below. XML Spy has been more or less one of the standard tools for developers of XML systems for decades now.

XML Schema Editor (XSD Editor) (liquid-technologies.com)

XMP and IPTC metadata, in sidecars to RAW files or embedded inside XMP-compliant files, are not primarily there to make searching on a local PC or Mac simpler. They are a general, standardized way to distribute metadata attached to the images floating around the globe on the Internet. With RAW we use sidecars, and these are used mainly locally in RAW converters, as you know; but on the net we preferably need JPEGs with the metadata written into an embedded XMP packet, to ensure the metadata never gets separated from the image files. In company DAM systems, sidecars are tied to many other types of documents than RAW files, but those are normally not intended to be used outside the company. That is why the PDF file formats are used: they are normally XMP-compatible and carry embedded XMP metadata too.
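To make the sidecar idea concrete, here is a minimal sketch of what an XMP sidecar with keywords looks like structurally: a `dc:subject` bag inside an `rdf:Description`. This is just an illustration built with Python’s standard library, not a full XMP implementation (real sidecars carry many more namespaces and a packet wrapper); the function names are my own.

```python
import xml.etree.ElementTree as ET

# Standard namespaces used by XMP sidecars for keyword lists.
NS = {
    "x": "adobe:ns:meta/",
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "dc": "http://purl.org/dc/elements/1.1/",
}
for prefix, uri in NS.items():
    ET.register_namespace(prefix, uri)

def build_sidecar(keywords):
    """Build a minimal XMP sidecar carrying keywords as a dc:subject bag."""
    xmpmeta = ET.Element(f"{{{NS['x']}}}xmpmeta")
    rdf = ET.SubElement(xmpmeta, f"{{{NS['rdf']}}}RDF")
    desc = ET.SubElement(rdf, f"{{{NS['rdf']}}}Description")
    subject = ET.SubElement(desc, f"{{{NS['dc']}}}subject")
    bag = ET.SubElement(subject, f"{{{NS['rdf']}}}Bag")
    for kw in keywords:
        ET.SubElement(bag, f"{{{NS['rdf']}}}li").text = kw
    return ET.tostring(xmpmeta, encoding="unicode")

def read_keywords(xmp_text):
    """Read the keywords back out of the sidecar XML."""
    root = ET.fromstring(xmp_text)
    return [li.text for li in root.iter(f"{{{NS['rdf']}}}li")]
```

Because the format is plain RDF/XML, any XMP-compliant tool can read the same keywords back, which is exactly the interoperability point being made above.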

On the Internet there is no local full-text search to rely on; instead, Google has indexed, among other data, the metadata in the images. Companies working seriously with DAM technology index their images and documents starting from the top folders of their folder trees. In big organisations and companies it is the XMP metadata that holds everything together; it feeds both these organisations and the Internet with data to search.

This data has to be open and not locked into proprietary systems. That doesn´t fly at all if we want to communicate with the world outside our own limited, local, personal computer world, and today a product like PM Plus is maybe the stand-alone product that most closely resembles how enterprise DAMs are built and work. As far as we can see, XMP will be future-proof, and when changes are made, data elements that have been replaced by others are usually still kept in the schemas and tools for compatibility reasons.

1 Like

One thing XMP has is lots of entries from different applications, each reading or writing in a different way, so it gets messy after a while. You even see duplicate entries written in different ways.
Long live “standards” :thinking: :crazy_face:
A clean-house function would be useful, or not?
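A very small part of such a “clean-house” function could be deduplicating keyword lists where different apps wrote the same keyword with different casing or stray whitespace. A minimal sketch (the function name is mine, and real cleanup would also have to reconcile duplicate XMP fields, not just keyword spellings):

```python
def dedupe_keywords(keywords):
    """Collapse duplicates that differ only in case or whitespace,
    keeping the first spelling encountered."""
    seen, out = set(), []
    for kw in keywords:
        key = kw.strip().casefold()   # normalize only for comparison
        if key and key not in seen:
            seen.add(key)
            out.append(kw.strip())    # emit the original (trimmed) spelling
    return out
```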

1 Like

Then it’s clear that I know the term “spy-tools” as well as you know Aperture :grin: sorry for that confusion.

Apparently not; you keep stirring it up, like all of us participants in this epically long thread. :smirk: And we don’t even get paid for it. :flushed: That’s scandalous, isn’t it?

Now, what I still don’t get: why does one need to use at least two different apps, at rather high cost and with some inconvenience? What is the benefit for the user in paying updates to at least two different companies, each with their own roadmap of when to upgrade and what to include? I’m not asking to tease you (okay, if it did tease you, it wouldn’t give me many nightmares…), I just found this the best part of Aperture: not the editing, cataloguing or publishing part, but all of them together in one single package. No worries about incompatibilities in metadata exchange.

Today there is the very weird situation that one could go from Lightroom to PL by “exporting” the RAW first to a DNG, then opening and editing it in PL, and exporting it back the same way, as another DNG, to LR. DNG is the world’s fattest RAW file format I know of, so I assume the “inventors” of this workflow already bought some stock in WD or Samsung or whoever will deliver the disk space.

I do like old libraries, the smell and the quietness. As a visitor I do enjoy visiting. But finding books with external systems not able to speak to each other can become a frustrating experience (“should be in this shelf but is borrowed”). Especially because books (and images as well) do contain so much more properties and special twists no keyword can really tell. I do understand the purpose of keywords, I just don’t understand the need of multiple apps.

Standards are a mere recommendation, I had to learn, and at best they only describe what was technically possible at the time of the standard’s writing. This is why I don’t think keywords alone will do the job. A Google image search is not based on keywords; they use other ways to find similar images. But keywords come into play if I don’t have a sample picture to put into the search. I just don’t take images for other people to find them.

The other thing is, not everyone needs to “communicate images with the world”; we already produce more images than we can review or revisit in a lifetime. So, to me it appears as if Stenis is slightly overrating the meaning and value of keywords, especially as we’re not living in a keyworded world. In few forests are there plates with the Latin names of the trees and bushes…

Okay, it’s great to find yoghurt in the dairy department, and not in household cleaning. But other than that, it doesn’t help much to have more keywords to find a good yoghurt. And it’s also great that I don’t have to research whether the “signmaker” responsible for the “dairy” sign is maybe about to deliver some “mango”, “sugar-free”, “low-carb”, “no plastic” signs.

A linear DNG, such as PL creates, is essentially a 48-bit TIFF file, and it is very nearly identical in size to such a TIFF file. It’s technically not a RAW file, as it does not contain raw sensor data. So yes, any true RAW file, for a given frame size, is going to be substantially smaller.
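The size gap follows directly from the pixel format: a Bayer mosaic stores one raw value per pixel, while a linear (demosaiced) DNG stores three 16-bit channels per pixel. A rough, uncompressed back-of-envelope calculation for a hypothetical 24 MP sensor with 14-bit raw data (real files vary with compression and embedded previews):

```python
# Hypothetical 24 MP frame (6000 x 4000), ignoring compression and previews.
pixels = 6000 * 4000

# Bayer raw: one 14-bit value per pixel.
bayer_raw_mb = pixels * 14 / 8 / 1e6

# Linear DNG: three demosaiced 16-bit channels per pixel (48 bits total).
linear_dng_mb = pixels * 3 * 16 / 8 / 1e6

print(bayer_raw_mb, linear_dng_mb)   # roughly 42 MB vs 144 MB
```

Under these assumptions the linear DNG is about 48/14 ≈ 3.4 × the size of the mosaic raw data, which matches the “fattest RAW format” complaint above.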

As the starter of this thread, I think I can safely say it has gone way beyond what I was trying to convey, which is that for those of us who do care about a strong keywording experience, PL is not yet a solid offering, though PL5 certainly made a significant step forward.

Aside from any standards, or implementations, or workflows, it remains a fact that I want to put detailed information on a significant subset of my photos, as do others. The only practical way I have found to achieve this efficiently is to use a system devised (or at least perfected) by Adobe. If it wasn’t as efficient as they make it in Lightroom, I am not sure how fastidious I would be, and maybe some would say I am therefore not as dedicated to the task as I make out. In reality, I became this fastidious because I was enabled by Lightroom many years ago and I thought “if I can do this, I should.”

Furthermore, while I call my needs “complex” in fact that is only a relative term. Some may only want to put “tree” while others would rather a full taxonomy of the tree. In reality, the Lightroom keyword system is structurally very simple — it is from the execution of workflow that it draws its incredible strength. I have previously given examples where I can attach something like a dozen keywords with as many keystrokes, from a possible list that is enormous, but structurally it is literally a simple tree with a few attributes on each node.
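The “simple tree with a few attributes on each node” can be sketched in a few lines. This is not Lightroom’s actual data model, just an illustration of the structure described above: each node has a name, an export flag, and synonyms, and tagging a leaf implies every ancestor keyword on the path.

```python
from dataclasses import dataclass, field

@dataclass
class KeywordNode:
    """One node in a hierarchical keyword tree (illustrative, not LR's model)."""
    name: str
    export: bool = True                              # written on export?
    synonyms: list = field(default_factory=list)     # e.g. Latin names
    children: dict = field(default_factory=dict)

    def add(self, path):
        """Add a hierarchical keyword like 'Plants|Trees|Oak'; return the leaf."""
        node = self
        for part in path.split("|"):
            node = node.children.setdefault(part, KeywordNode(part))
        return node

def applied_keywords(root, path):
    """Tagging the leaf implies every exportable ancestor plus synonyms."""
    out, node = [], root
    for part in path.split("|"):
        node = node.children[part]
        if node.export:
            out.append(node.name)
        out.extend(node.synonyms)
    return out
```

With a tree like this, typing one leaf keyword attaches the whole ancestor chain and any synonyms, which is how a dozen keywords can come from a handful of keystrokes.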

So finding yoghurt in the supermarket is a fine analogy to finding a photo with a “tree”. But historians, researchers, and enthusiasts — and I count myself as all three of these — will always want something more. I might draw a faint analogy to the ever more popular pastime of genealogy. Very few people stop at simply finding and linking up names. They add dates — births, marriages, children, death — and places and occupations and much, much more.

So I return to the title of this thread. PL5 is one step closer, many steps still to go for people who really use keywords.

2 Likes