Better HN

tom_0 · 2mo ago
Oh, I can't believe I missed that! That makes whisper.cpp and llama.cpp valid options if the user has Nvidia, thanks.
lostmsu
2mo ago
Whisper.cpp and llama.cpp also work with Vulkan.
tom_0
OP
2mo ago
Yeah, I researched this and completely missed that whole part. In my defense, I looked into it back in 2023, which is ages ago :) Local models seem to be maturing fast.
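(For anyone landing on this thread: both projects build on the shared ggml backend, and Vulkan support is a build-time option. A rough sketch of enabling it, assuming the Vulkan SDK is installed and that `GGML_VULKAN` is the CMake flag both projects currently expose — check each repo's build docs for the exact option name on your version:)

```sh
# Sketch: build whisper.cpp and llama.cpp with the Vulkan backend enabled.
# Requires CMake and the Vulkan SDK on the build machine.

git clone https://github.com/ggerganov/whisper.cpp
cmake -B whisper.cpp/build -S whisper.cpp -DGGML_VULKAN=1
cmake --build whisper.cpp/build --config Release

git clone https://github.com/ggerganov/llama.cpp
cmake -B llama.cpp/build -S llama.cpp -DGGML_VULKAN=1
cmake --build llama.cpp/build --config Release
```

This avoids the CUDA dependency entirely, so it works on AMD and Intel GPUs as well as Nvidia.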