💻 Focus
Pinned
llm-chat-service (Public)
A lightweight, concurrent Go web service powering a single-conversation chat interface with Groq Cloud LLMs. Uses Server-Sent Events (SSE) for real-time token streaming.
Go
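To illustrate the streaming approach the description mentions, here is a minimal Go sketch of an SSE endpoint. The handler name, route, and token source are illustrative assumptions, not the repository's actual API; a real implementation would forward tokens from the Groq Cloud streaming response instead of the placeholder slice.

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// streamTokens is a hypothetical SSE handler: each generated token is
// written as a "data:" event and flushed immediately, so the client sees
// output as it is produced rather than waiting for the full completion.
func streamTokens(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "text/event-stream")
	w.Header().Set("Cache-Control", "no-cache")
	w.Header().Set("Connection", "keep-alive")

	flusher, ok := w.(http.Flusher)
	if !ok {
		http.Error(w, "streaming unsupported", http.StatusInternalServerError)
		return
	}

	// Placeholder tokens; in the real service these would come from the
	// Groq Cloud LLM streaming response.
	tokens := []string{"Hello", ", ", "world", "!"}
	for _, tok := range tokens {
		select {
		case <-r.Context().Done():
			return // client disconnected, stop streaming
		default:
		}
		fmt.Fprintf(w, "data: %s\n\n", tok)
		flusher.Flush()
		time.Sleep(100 * time.Millisecond) // simulate generation latency
	}
}

func main() {
	http.HandleFunc("/chat/stream", streamTokens)
	http.ListenAndServe(":8080", nil)
}
```

On the client side, an `EventSource` (or any SSE-capable consumer) pointed at the endpoint receives each token as a separate event, which is what enables the real-time chat rendering described above.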