No one who has been using any model for just the past 30 minutes would say that it has "pretty much replaced Google/SO" for them, unless they were being facetious.
The instruct version of Code Llama can certainly be run locally without trouble, and that's interesting too, but I keep wanting to test out a local Copilot alternative that uses these nice, new completion models.
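For context, what makes the base completion models a good fit for a Copilot-style workflow is their fill-in-the-middle (infilling) support: the editor hands the model the text before and after the cursor, and the model generates what goes in between. A minimal sketch of assembling such a prompt, using the sentinel-token format described in the Code Llama paper (the exact sentinel spelling and tokenization differ per runtime, so treat this as illustrative, not a drop-in client):

```python
def build_infill_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle prompt in the Code Llama
    sentinel-token style: the model is asked to generate the code
    that belongs between `prefix` and `suffix` (the text before
    and after the editor's cursor)."""
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

# Hypothetical editor state: the cursor sits inside the function body.
before_cursor = "def fib(n):\n    "
after_cursor = "\n    return fib(n - 1) + fib(n - 2)"

prompt = build_infill_prompt(before_cursor, after_cursor)
# A local server (e.g. llama.cpp) would feed this prompt to the model
# and return everything generated up to the end-of-infill token.
print(prompt)
```

In a real setup the sentinel tokens are usually inserted as special token IDs by the tokenizer rather than as literal strings, but the prefix/suffix split is the core idea a local Copilot alternative has to implement.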