crystaln
1mo ago
Seems much more likely that the cost will go down 99%. With open-source models and architectural innovations, something like Claude will run on a local machine for free.
walterbell
1mo ago
How much RAM and SSD will future local inference need to be competitive with present cloud inference?
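
For a rough sense of scale: weight memory alone is roughly parameter count times bytes per parameter, before adding the KV cache and activation overhead. A minimal Python sketch of that back-of-envelope arithmetic; the model sizes and quantization levels below are illustrative assumptions, not figures from the thread:

    # Back-of-envelope estimate of RAM/VRAM needed just to hold model
    # weights: params * bits_per_param / 8 bytes. KV cache and
    # activations add more on top (not modeled here).

    def weight_memory_gb(params_billions: float, bits_per_param: int) -> float:
        """Approximate gigabytes required for the weights alone."""
        bytes_total = params_billions * 1e9 * bits_per_param / 8
        return bytes_total / 1e9

    # Hypothetical model sizes (billions of params) and common
    # quantization widths (fp16, int8, int4) -- chosen for illustration.
    for params in (8, 70, 405):
        for bits in (16, 8, 4):
            print(f"{params}B @ {bits}-bit: ~{weight_memory_gb(params, bits):.0f} GB")

By this arithmetic, a 70B-parameter model at 4-bit quantization needs on the order of 35 GB for weights alone, which is why quantization width largely determines whether a given model fits in consumer RAM.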