Ultimately, the working parts of a given model are effectively opaque even to the smartest humans once you go beyond the bare basics. We know the model's shape, the number of layers, and what the inputs and outputs correspond to, but not much else. It's the product of a machine trying things semi-randomly until something works, with the best-performing candidate selected for production.
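As a toy sketch of that "try things randomly, keep what works" idea (real training uses gradient descent, but this captures the selection part): fit a tiny linear model by pure random search and keep the best candidate found. The target function and search ranges here are made up for illustration.

```python
import random

# Hypothetical example: random search over the parameters of a tiny
# linear model y = w * x + b. We try random candidates and keep the
# one with the lowest error. The selected (w, b) "works", but nobody
# chose those exact numbers by hand -- a crude analogue of how a
# trained model's weights end up uninterpretable.

DATA = [(x, 2 * x + 1) for x in range(10)]  # target: y = 2x + 1

def loss(w, b):
    # Sum of squared errors over the toy dataset.
    return sum((w * x + b - y) ** 2 for x, y in DATA)

def random_search(trials=5000, seed=0):
    rng = random.Random(seed)
    best = (rng.uniform(-5, 5), rng.uniform(-5, 5))
    for _ in range(trials):
        cand = (rng.uniform(-5, 5), rng.uniform(-5, 5))
        if loss(*cand) < loss(*best):
            best = cand  # keep the best model produced so far
    return best

w, b = random_search()
```

With enough trials the search lands near w = 2, b = 1, but the procedure itself never "understands" the target function, which is the point being made above.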
At a high level, that's not altogether different from generating an image or a piece of text with a model: you introduce a random factor and a number of steps, and the machine uses this unknowable model to produce something a person can understand.
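A minimal sketch of that generation loop, using a trivial bigram table as a stand-in for the model (the table and words are invented for illustration): given a seed and a step count, the same opaque data deterministically yields human-readable output.

```python
import random

# Hypothetical example: seeded generation from a toy bigram "model".
# The same seed and step count always produce the same output; a
# different seed gives a different sample. The table is just data --
# reading it tells you little about why any particular sample emerges.

BIGRAMS = {
    "the": ["cat", "dog"],
    "cat": ["sat", "ran"],
    "dog": ["ran", "sat"],
    "sat": ["quietly"],
    "ran": ["away"],
}

def generate(seed, steps):
    rng = random.Random(seed)  # the "random factor"
    word, out = "the", ["the"]
    for _ in range(steps):     # the "number of steps"
        choices = BIGRAMS.get(word)
        if not choices:
            break
        word = rng.choice(choices)
        out.append(word)
    return " ".join(out)
```

Real image or text generators are vastly larger, but the interface is the same shape: seed plus steps in, something legible out.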
I do think the law should update and grant some protections to people who produce models, because losing all protection would mean the death of open model releases, and then we'd be staring down the barrel of corpos controlling the technology even more than they do now. At least open models give end users some semblance of control.