This is the crux of the issue for me. It's a different set of rules for AI companies than everyone else. If I started selling pirated copies of Nintendo games they would send an army of lawyers after me and this "opt-out" reasoning would not be a valid defense in court. These AI companies are trying to get away with stealing art and other content with a simple "whoopsie, we promise we won't do it again" when people demand that their own rights be respected.
It's not copying, it's breaking down work to its foundational features and recombining those features with others to make new things. Literally exactly what humans do when they make art.
If a person was doing what these models are doing it not only wouldn't be illegal, it would be laughable if we even had the discussion.
Current trajectory will only harm original creators, there is zero consideration or benefit for them. On the other side you have companies that stand to make billions off of their work.
Arguing in favor of such a system is, in my mind, appalling. Either ensure that copyright law prohibits unauthorized machine learning from valuable original art or abolish it completely.
First, whether particular features like watermarks end up in the end product is not terribly relevant. The particular model implementation that produces that behavior doesn't understand that watermarks are not a desirable part of the transformed end products, whereas humans do. In that way, the AI is arguably a little more 'honest' about what it's doing. It would be trivial to make the AI understand this and stop reproducing watermarks.
You claim that the model performing this work is somehow different because AI doesn't 'think' about problems, as evidenced by strange artifacts, whereas presumably humans do think about them because of the absence of these artifacts. I'm curious why this makes a difference. If a model were sufficiently advanced that you were unable to differentiate a painting on certain themes made by a human from one made by an AI, would it be somehow less objectionable? Why or why not? If the end product is of identical quality, why should the route you take to get there matter in the eyes of the law?
You bring up scale, but scale is also not relevant. Say that I create a school devoted to training legions of artists to produce paintings in the style of a particular artist, while maintaining a transformative aspect. Is this illegal because I'm doing it at scale? No. If that's not illegal, why is it illegal to do it with code? Because it's more efficient? In what other domain is it illegal to produce creative work more efficiently by transforming existing work?
How's that different from a human artist shackled to a company by some secret agreement in which they are not the stronger party?
It doesn't matter what the law says or what is "right." It comes down to who has the power and who doesn't.
(The other difference between what humans do and what AIs do is a matter of scale. A human imitates by spending many hours to duplicate a work of art. An AI can churn out millions in a second. That's a separate issue, though.)
No, it is only 'transformative' in the way a computer transforms data. A person would be inspired and create some kind of unique work. A prompt would, given identical start-up conditions, result in the exact same image over and over again no matter who entered it.
Even a trained monkey giving the prompt would result in the same output, as would 'thumper' or the closest local equivalent.