I am all for local models, but this is massively overselling what they are capable of on common consumer hardware (32GB RAM).
If you want a sense of what your hardware can pull off, find the top-ranking ~30B models on lmarena.ai and start a direct chat with them right there on the site. Ask the questions you would normally ask and see whether the answers satisfy you.