A problem I see is that we're now being provided with AI-driven processing tools that are beginning to blur the line between photography and AI-generated imagery. How much processing would one accept before saying, "This is not a real photograph"? Of course, in the world of commercial photography, many wouldn't care; the goal is to have a "perfect" image. But for the rest of us, the question will become inevitable.
I don't pay a lot of attention to these tools, as I don't use them. But I do get frequent ads for many of them (Topaz Labs being one of the most persistent). They describe in glowing detail what can be done to alter a photograph, and show examples. Frankly, I'm horrified, and my personal take is that the result often isn't really a photograph anymore. Of course, that's my take, and to put it into perspective, back when I shot digital I would tie myself into philosophical and ethical knots about cloning out sensor dust. That's taking things to an extreme, but it's a slippery slope...
The bottom line in all art is the provenance of the art object. Cleaning dirt (and other 'noise') out of an image while rendering it, in order to enhance the capture, has been done since the beginning of photography, so I see little issue with that. Photographic compositing (merging photographic images into a coordinated whole in as seamless a way as possible) is also a perfectly acceptable photographic endeavor ... see the work of Jerry Uelsmann, the master of this kind of photographic art.
The key is that all of these forms of art include provenance that testifies to what the image(s) are about (or the "intent by the artist", to use another description), when they were captured, and how they were rendered. You can't do that if your imaging process is "tell the AI software to generate a smiling woman with a baby in her arms that looks like Aunt Darlene" ... Provenance in that case can say what the instructions were, what AI tool was used, who dreamed up the idea for the image, etc., but it CANNOT say that the image is a photograph. It's computer-generated art, that's all.
If you have AI tools that can take your photograph, isolate the key subject you wanted, and then composite it into a lunar landscape setting ... Well, I can accept that the provenance in that case can articulate some of what went into the artwork, but the piece is still not a Photograph ... it's a painting which includes a photographic element.
That's not what I produce or am interested in producing, but I've seen stuff like that which is pretty nicely done.
The important thing is that the provenance of whatever art piece or photograph you make is clear and honest. If it isn't, then the piece is not credible, whether it might be considered a photograph or computer graphics.
Commercial art, whether photographs or CGA ... the intent is always to display or promote something, usually for profit. So the provenance of an image used for this kind of purpose is rather simple. Much more critical is the provenance of images used for documentary and forensic purposes. Nothing "generated" by AI can be truly documentary in nature; it can at best be an artist's simulation or recreation.
G