Fair enough, but what exactly do you want to do? Require an indication like “x% of this book was generated by AI and not competently proof-read”? And how would you enforce it? I have a hard time envisioning a practical way to tackle this. The machine-gun analogy doesn’t quite carry over here.
There are serious lobbying efforts by industry incumbents and authoritarians to apply arms-like restrictions on access to AI: you'd need a special license to legally create or run your own powerful models. I presume that's what they're getting at.