- insights-mcp (Public)
  Forked from RedHatInsights/insights-mcp: An MCP server to connect insights functionality to an AI / LLM
  Python · Apache License 2.0 · Updated Nov 4, 2025
- ramalama (Public)
  Forked from containers/ramalama: RamaLama is an open-source tool that simplifies the local use and serving of AI models for inference from any source through the familiar approach of containers.
  Python · MIT License · Updated May 1, 2025
- text-generation-inference (Public)
  Forked from huggingface/text-generation-inference: Large Language Model Text Generation Inference
  Python · Apache License 2.0 · Updated Apr 30, 2025
- llama.cpp (Public)
  Forked from ggml-org/llama.cpp: LLM inference in C/C++
  C++ · MIT License · Updated Apr 30, 2025
- vllm (Public)
  Forked from vllm-project/vllm: A high-throughput and memory-efficient inference and serving engine for LLMs
  Python · Apache License 2.0 · Updated Apr 29, 2025
- ramalama.github.io (Public)
  Forked from containers/ramalama.github.io: RamaLama web site.
  JavaScript · Updated Apr 24, 2025
- image-builder-frontend (Public)
  Forked from osbuild/image-builder-frontend: Image Builder service for console.redhat.com
  TypeScript · Apache License 2.0 · Updated Mar 4, 2025
- quickstarts (Public)
  Forked from RedHatInsights/quickstarts: Backend service for integrated quickstarts.
  Go · MIT License · Updated Apr 24, 2024