mayankchhabra · 2y ago
It's fairly straightforward to add GPU support when running on the host, but LlamaGPT runs inside a Docker container, and that's where it gets a bit challenging.
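(For context, Docker Compose can reserve a GPU for a service through the deploy.resources.reservations.devices block. A minimal sketch follows; the service name and image tag are illustrative, not LlamaGPT's actual compose file:)

    # docker-compose.yml (sketch; image name is hypothetical)
    services:
      llama-gpt:
        image: ghcr.io/getumbrel/llama-gpt-api   # illustrative tag
        deploy:
          resources:
            reservations:
              devices:
                - driver: nvidia
                  count: all
                  capabilities: [gpu]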
stavros · 2y ago
It shouldn't; NVIDIA provides a CUDA Docker plugin that lets you expose your GPU to the container, and it works quite well.
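(The plugin referred to here is what NVIDIA now ships as the NVIDIA Container Toolkit. A rough sketch of enabling it on an Ubuntu/Debian host, assuming the NVIDIA driver and Docker are already installed and the toolkit's apt repository has been added:)

    # Install the toolkit and register the NVIDIA runtime with Docker
    sudo apt-get install -y nvidia-container-toolkit
    sudo nvidia-ctk runtime configure --runtime=docker
    sudo systemctl restart docker

    # Verify the GPU is visible from inside a container (CUDA image tag is illustrative)
    docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi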
dicriseg · 2y ago
See above if you're interested in that. It does work quite well, even with nested virtualization (WSL2).
stavros · 2y ago
I am, thanks!
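(For the WSL2 case mentioned above, a quick sanity check, assuming the Windows-side NVIDIA driver is installed and the container toolkit is set up inside the distro:)

    # Inside the WSL2 distro: the Windows NVIDIA driver exposes the GPU,
    # so nvidia-smi should work without a Linux-side driver install
    nvidia-smi
    # The same --gpus flag then works for containers started from within WSL2
    docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi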