I think the only principled positions to be had here are either to ignore IP rights for LLM training, or to give up entirely, because a model designed to be as general as a human will need to be trained like a human, i.e. immersed in the same reality we are, the same culture, most of which is shackled by IP claims. And then, by definition, as the model gets better it gets more competitive with humans at everything humans do.
You can produce a complaint that "copyrighted X was used to train a model that can now compete with humans at producing X" for an arbitrary value of X. You can even produce a complaint that "copyrighted X was used to train a model that now outcompetes us at producing Y", for arbitrary X and Y that are not even related, and it will still be true. Such is the nature of a general-purpose ML model.
IP rights, and even IP itself as a concept, are not fundamental to existence, nor the default state of nature. They are contingent concepts, contingent on many factors.
For example, IP has to be actively and continuously maintained as time advances. There can be disagreement on how often: per annum, per case, per WIPO meeting, etc.
But if no such activity occurs over a very long time, say a century, then any claims to any IP will likely, by default, be extinguished.
So nobody needs to do anything for it all to become irrelevant. That will automatically occur given enough time…
That argument is just as absurd as one where you write a program that does the same thing. Model training is not only fair use; publishers should be grateful someone has done something of value for humanity with their collected drivelings.
Many artists don't like how image generators, trained on their original work, allow others to replicate their (formerly) distinctive style, almost instantly, for pennies.
Many authors don't like how language models enable anyone to effortlessly create paraphrased versions of the author's books. Plagiarism as a service.
Human artists and writers can (and do) do the same thing, but the smaller scale, slower speed, and higher cost reduce the economic effects.