No, I would argue that from the three main ingredients - training data, model source code and weights - weights are the furthest away from something akin to source code.
They're more like obfuscated binaries. When it comes to fine-tuning alone, however, things do shift a bit, yes.