> What I'd assume Valve is worried about is that it only takes one major decision against Stable Diffusion in court to suddenly leave us in a state where "this game used Stable Diffusion" is the proof that's needed.
Hard to see any plausible outcome that would have that result for users of SD. If model training isn’t fair use, that’s definitely a blanket-liability issue for Stability.AI, and for Midjourney, OpenAI, and lots of people training their own models, whether from scratch or by fine-tuning, on others’ copyright-protected works.
But “using a tool that violates copyright in the workflow” is not itself infringement. Whether, and in what situations, prompting SD to produce output makes that output a violation of copyright (and of whose) would be a completely different decision. I can certainly see cases where it might be, such as deliberately seeking to reproduce a particular copyright-protected element, like a character, from the source data, irrespective of the copyright status of the model itself. But I haven’t seen anyone propose a rule grounded in copyright law (much less an argument justifying it as likely) that gets you to “used SD, therefore in violation”.
There are lots of blanket ethical arguments against using it, but that’s a different domain than law.