Enabling CUDA on Docker #574
Unanswered
franco-giordano asked this question in Q&A
Replies: 2 comments 2 replies
Hey @franco-giordano, thanks for the detailed report:
Can you share your
Hi, I'm having trouble enabling CUDA-powered inference on my machine. What I've tried:

- Loaded the chat model, but it ends up on the CPU (the logs show `Loaded chat model to CPU. utils.py:32`, and CPU usage spikes during inference).

Any ideas? From previous discussions, issues, and docs I get the impression that it should work, but I'm not sure. Some specs:

`nvidia-smi` output:
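Not from the original thread, but for anyone hitting the same symptom: the usual prerequisite for CUDA inside Docker is the NVIDIA Container Toolkit on the host, plus passing `--gpus` to `docker run`. A minimal sketch, assuming an apt-based host (Ubuntu/Debian) and that NVIDIA's package repository has already been added; the CUDA image tag is an example, not specific to this project:

```shell
# Install the NVIDIA Container Toolkit on the host
# (assumes NVIDIA's apt repository is already configured).
sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit

# Register the NVIDIA runtime with Docker and restart the daemon.
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Sanity check: the container should see the GPU and print the
# same table as nvidia-smi on the host. The image tag is an
# example; pick one matching your driver's supported CUDA version.
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```

If the final `nvidia-smi` works inside the container but the application still logs `Loaded chat model to CPU`, the container is fine and the issue is more likely the application's build (e.g. a CPU-only wheel or a build without CUDA support).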