LocalAI always shows "Fails: could not load model: rpc error" #1755
Unanswered
philipp-fischer asked this question in Q&A
Replies: 2 comments · 6 replies
-
I built LocalAI locally (not as Docker), since I'm on an ARM architecture with CUDA. I am able to run the ./localai binary and talk to the server. However, as soon as I try to run inference with a model, the server doesn't seem to find a backend that works:

Any ideas?
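A minimal sketch of how this failure typically surfaces, assuming a default build: the model name "phi-2", the `./models` directory, and the default port 8080 are placeholders, not details from this thread.

```bash
# Start the locally built binary with verbose logging enabled
# (DEBUG=true is LocalAI's documented debug switch; adjust the
# models path to your own layout):
DEBUG=true ./localai --models-path ./models

# From a second terminal, trigger inference via the OpenAI-compatible
# endpoint. When no backend can load the model, the server log shows
# "could not load model: rpc error":
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "phi-2", "messages": [{"role": "user", "content": "test"}]}'
```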
-
@mudler, any idea what's going wrong here? I'm sure the model is in the right |
0 replies
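The comment above is truncated; assuming it refers to the model file's location, a quick sanity check looks like the following sketch (the `./models` path is an assumption and must match whatever was passed via `--models-path`):

```bash
# Verify the model file actually sits in the directory the server scans:
ls -lh ./models

# Note: the "model" field in API requests must match either a model
# filename in that directory or the "name" field of a YAML config
# placed alongside it.
```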
-
@philipp-fischer can you share the logs with the |
6 replies
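The six follow-up replies are not shown here. One way to capture verbose logs to share, as a sketch under the assumption that the DEBUG environment variable from LocalAI's documentation applies to this build:

```bash
# Run with debug logging and save everything to a file; 2>&1 merges in
# the stderr output from the gRPC backend processes:
DEBUG=true ./localai --models-path ./models 2>&1 | tee localai-debug.log

# The lines just before the final failure show each backend attempt:
grep -B2 "could not load model" localai-debug.log
```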