Better HN
0 points
2ndorderthought
3d ago
I see your updated post. Switch over to llamacpp and look up recommended quants and settings. A good place for this info is on /r/localllama
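For context, a typical llama.cpp setup along those lines might look like the sketch below. The model filename, quant level (Q4_K_M), and flag values are illustrative assumptions, not details from this thread; consult the model card and /r/localllama threads for the recommended quant and sampler settings for a specific model.

```shell
# Serve a quantized GGUF model with llama.cpp's built-in server.
# The model path and quant (Q4_K_M) below are placeholders.
./llama-server \
  -m models/model-Q4_K_M.gguf \
  -ngl 99 \        # offload as many layers as fit to the GPU
  -c 8192 \        # context window size in tokens
  --port 8080      # exposes an OpenAI-compatible HTTP API
```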
gchamonlive
3d ago
Yep! I'm currently trying vllm, then I'll give llamacpp a try too