Laminar - open-source observability platform purpose-built for AI agents. YC S24.
Updated Feb 7, 2026 - TypeScript
Templates and workflows for generating PRDs, tech designs, MVPs, and more using LLMs in AI IDEs
🔥 🔥 Alternative to JSON 🔥 🔥
FlowLLM: Simplifying LLM-based HTTP/MCP Service Development
Supercharge qwen-code with hybrid prompt-chaining, powered by gemini-cli integration. Performance improvements include 72% faster execution, 36-83% token efficiency gains, and 91.7% success rate verified across 5 repository benchmarks.
LLM Framework for LLMs
A battle-tested framework for AI-assisted development. Session handoffs, knowledge preservation, and structured workflows. Zero dependencies.
Agent-style LLM orchestration framework for composable AI workflows
Force any OpenAI-compatible tool (Aider, Fabric, Interpreter) to use Gemini, Groq, Cerebras, or Ollama. Pure Bash. Zero dependencies.
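The repo above is pure Bash, but the underlying trick it describes is general: OpenAI-compatible tools resolve their endpoint and key from environment variables, so swapping providers is just swapping those values. A minimal Python sketch of that pattern (the `BACKENDS` table is an illustrative assumption, not the repo's actual configuration):

```python
import os

# Illustrative backend table: (base URL, env var or literal key).
# Ollama's local OpenAI-compatible endpoint needs no real key.
BACKENDS = {
    "ollama": ("http://localhost:11434/v1", "ollama"),
    "groq": ("https://api.groq.com/openai/v1", "GROQ_API_KEY"),
}

def use_backend(name: str) -> str:
    """Point OpenAI-compatible clients at the chosen backend via env vars."""
    base_url, key = BACKENDS[name]
    os.environ["OPENAI_BASE_URL"] = base_url  # honored by recent OpenAI SDKs
    os.environ["OPENAI_API_KEY"] = os.environ.get(key, key)
    return base_url

use_backend("ollama")
print(os.environ["OPENAI_BASE_URL"])  # http://localhost:11434/v1
```

Any tool that reads the standard variables then talks to the substitute backend without code changes.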
CLI tool for LLM prompt pipelines. Reusable. Shareable. Scriptable.
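A prompt pipeline of the kind described above can be sketched as a chain of templates, each stage feeding its output into the next stage's `{input}` slot. This is a generic illustration, not the tool's actual API; the `llm` callable is a stand-in where a real tool would invoke a model:

```python
from typing import Callable, List

def run_pipeline(stages: List[str], text: str, llm: Callable[[str], str]) -> str:
    """Run text through a chain of prompt templates, one LLM call per stage."""
    for template in stages:
        text = llm(template.format(input=text))
    return text

# Toy "LLM" that echoes its prompt, to make the data flow visible.
echo = lambda prompt: prompt

result = run_pipeline(
    ["Summarize: {input}", "Translate to French: {input}"],
    "raw notes",
    echo,
)
print(result)  # Translate to French: Summarize: raw notes
```

Because each stage is just a string template, pipelines like this are easy to store, share, and script, which is the appeal the blurb advertises.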
Miako (short for Mirai Aiko) is an agentic, decision-based multi-step LLM workflow that handles user queries regardless of language, modern slang, or Gen-Z everyday terms. The workflow runs on cheap, affordable models such as those hosted on Groq. Built as a production-ready, open-source MVP backend; free to use.
Example of GPT-driven role-based prompt orchestration: custom roles, split-role generation, and a synthesis review workflow.
Meta Prompting Directive (MPD): a system-level workflow for AI-generated prompts and AI-executed tasks
Calibrate LLM responses for high-agency power users, prioritizing rigorous analysis and executive control. Overrides the default, RLHF-driven tendency toward immediate, ungrounded solutions that serve lower-agency 'LLM as magic tool' workflows.
An open-source Node.js RESTful backend API server designed to manage and execute complex workflows with AI and human-in-the-loop capabilities.
Sovereign Context Protocol - A human-first workflow for managing LLM memory. Not an MCP server.
Project structure and workflow for AI-assisted software development.