it's not
the parasites can't train their shitty "AI" if they don't have anything to train it on
It will however reduce the positive impact your open source contributions have on the world to 0.
I don't understand the ethical framework for this decision at all.
There are also plenty of other open source contributors in the world.
> It will however reduce the positive impact your open source contributions have on the world to 0.
And it will reduce your negative impact through helping to train AI models to 0.
The value of your open source contributions to the ecosystem is roughly proportional to the value they provide to LLM makers as training data. Any argument you could make that one is negligible would also apply to the other, and vice versa.
if true, then the parasites can remove ALL code where the license requires attribution
oh, they won't? I wonder why
If bringing fire to a species lights and warms them, but also gives some of its members the means and incentive to burn everything down for good, you have every ethical right to ponder whether or not you contribute to this fire.
For your fire example, there's a difference between being Prometheus teaching humans to use fire compared to being a random villager who adds a twig to an existing campfire. I'd say the open source contributions example here is more the latter than the former.
Not if most of it is machine generated. The machine would start eating its own shit. The nutrition it gets is from human-generated content.
> I don't understand the ethical framework for this decision at all.
The question is not one of ethics but one of incentives. People producing open source are incentivized in a certain way, and it is abhorrent to them when that framework is violated. There needs to be a new license that explicitly forbids use for AI training. That may encourage folks to continue to contribute.
In both cases I get the frustration - it feels horrible to see something you created be used in a way you think is harmful and wrong! - but the world would be a worse place without art or open source.
I'm not surprised that you don't understand ethics.
I couldn't care less if their code was used to train AI - in fact I'd rather it wasn't since they don't want it to be used for that.
My point was that the hypothetical of "not contributing to any open source code" to the extent that LLMs had no code to train on, would not have made as big of an impact as that person thought, since a very large majority of the internet is text, not code.
I see this as doing so at scale, and giving up on its inherent value because of that is most definitely throwing the baby out with the bathwater.
this is precisely the idea
add to that the rise of vibe-coding, and that should help accelerate model collapse
everyone who cares about the quality of software should immediately stop contributing to open source