magic_hamster
2y ago
Macs don't support CUDA, which means all that wonderful hardware will be useless for anything AI-related for at least a few years. There's Metal, but it has its own set of problems, the biggest being that it isn't a drop-in CUDA replacement.
adam_arthur
2y ago
You can do LLM inference without CUDA just fine. Download Ollama and see for yourself.
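
For reference, Ollama runs models locally behind a small HTTP server (using Metal on Apple Silicon rather than CUDA). A minimal sketch of "seeing for yourself" from Python, assuming Ollama is installed and a model such as llama3 has already been pulled; the model name and prompt here are placeholders, not from the thread:

    # Query a locally running Ollama server (default endpoint: localhost:11434).
    # Assumes `ollama pull llama3` has been run; model/prompt are illustrative.
    import json
    import urllib.request

    payload = json.dumps({
        "model": "llama3",           # any locally pulled model
        "prompt": "Why is the sky blue?",
        "stream": False,             # return a single JSON response
    }).encode("utf-8")

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])

No CUDA involved at any point; on a Mac the GPU work goes through Metal under the hood.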
doublepg23
2y ago
I'm assuming this won't support CUDA either?