In @Chriscrawfordphoto's defense, I do think he has a point. It's kind of nutty to have five figures' worth of equipment, travel all over the place to take pix, and then grump about $120 a year for editing software.
HOWEVER, the cost was never really my objection. My objection is having to share my images with a nameless, faceless, unaccountable entity that can, at least in principle, use AI to harvest all manner of metadata off them.
Do you have pictures of people eating, drinking, smoking, or shooting (hopefully not all at once) from your last hunting trip? That AI may someday be able to cross-reference with the AI-driven rules medical insurers use.
Do your pictures express strong political or social views? Uh oh, we can't let that go on without strict oversight by the "right" people. (In the US, both Hillary Clinton and Donald Trump have been victims of this, so I am not being partisan in any way here.)
Do you have - GASP - nudity in your pictures??? Remember the photographer who took innocent, charming pictures of their kids running around sans diaper and got attacked as an exploiter of children?
etc. etc. etc.
You think I'm paranoid? About half my work is in cybersecurity, threat interdiction, and threat detection; the other half involves other modern technology design work. Both areas make heavy use of AI, for good and for malicious purposes. The problem isn't the AIs themselves - they matter about as much as which film stock you prefer. No, the problem is the organizations that use them: mostly unaccountable and mostly opaque. If you really want to see how bad this can get, go read up on MK Ultra.
I trust nothing that gets too big - big government, big education, big church, big business, or big finance. When something gets sufficiently large, it loses sight of its core mission and instead works primarily to protect its own existence.
And THAT, boys and girls, is why I don't like putting personally identifiable content out on some megacorp server.
I have to go polish my tinfoil hat now ...