Hardware requirements for perplexica #348
Replies: 4 comments
-
RAM limits the models you can use, and you need some sort of CUDA-supported graphics card.
-
I am asking myself the same question. I suggested Perplexica as a possible package for a self-hosting distribution (YunoHost), and the question came up there as well.
-
You can run Perplexica on an ARM Mac with 16 GB of integrated RAM. You can also go for an Intel setup if you have 8-16 GB of RAM and an 8 GB video card supported by the CUDA drivers (Nvidia). RAM limits the model size; you can run good modern models of 5-6 GB on this configuration. It's designed to run on-site, and it's tricky to get it running on an external server. Alternatively, you can run Ollama externally, or go through the OpenAI, Groq, or Anthropic APIs, and then you have fewer limitations.
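For the external-Ollama or hosted-API setups mentioned above, the connection is set in Perplexica's config file. A minimal sketch, assuming key names along the lines of the project's sample.config.toml (they may differ between versions, so check the copy shipped with your release):

```toml
# Illustrative Perplexica config fragment -- section and key names
# are assumptions based on the project's sample config and may
# differ in your version.

[API_KEYS]
OPENAI = ""        # set one of these to use a hosted API instead of local inference
GROQ = ""
ANTHROPIC = ""

[API_ENDPOINTS]
# Point this at a remote Ollama instance to run models off-machine
OLLAMA = "http://your-ollama-host:11434"
```

With a remote Ollama or a hosted API, the local machine only runs the Perplexica app itself, so the RAM/VRAM limits discussed above no longer apply to model size.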
-
Hi, first of all, I am really happy to have found Perplexica. Really stunning what can be done locally on my machine. :) I am running Perplexica on my Asus Flow Z13 gaming tablet with 16 GB RAM and 4 GB VRAM on the RTX 3050 Ti. I can run the Mistral-Nemo 12B model. You have to wait a bit for the answers, but it works. :)
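As a rough rule of thumb, you can estimate whether a model fits in your RAM/VRAM from its parameter count and quantization level. A minimal sketch (the 20% overhead factor for the KV cache and runtime buffers is an assumption, not an exact figure):

```python
def approx_model_size_gb(params_billion: float,
                         bits_per_weight: int = 4,
                         overhead: float = 1.2) -> float:
    """Rough memory footprint of a quantized LLM.

    params_billion:  parameter count in billions (e.g. 12 for Mistral-Nemo 12B)
    bits_per_weight: quantization level (4 for Q4, 8 for Q8, 16 for fp16)
    overhead:        fudge factor for KV cache and runtime buffers (~20%, assumed)
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# Mistral-Nemo 12B at 4-bit quantization: roughly 7 GB
print(round(approx_model_size_gb(12), 1))
```

A ~7 GB model does not fit in 4 GB of VRAM, which is consistent with the experience above: Ollama offloads the remaining layers to the CPU, so the model runs, just slowly.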
-
Hello, what are the hardware requirements to run Perplexica?
For example, would Windows 10 with 8 GB RAM and an i7 CPU be enough?