Pretty much. Also there's no way of "securing" LLMs without destroying the quality that makes them interesting and useful in the first place.
I'm putting "securing" in scare quotes because IMO it's fool's errand to even try - LLMs are fundamentally not securable like regular, narrow-purpose software, and should not be treated as such.