Autonomous research agents for Claude Code - passively research and inject helpful context
Updated Dec 21, 2025 - TypeScript
Uses Google Cloud's Vertex AI platform, via its Python API, to work with large language models (LLMs) for natural language processing.
A FastAPI gateway for local LLMs that adds intelligent web research, multilingual recency/how-to detection, time-anchored guidance, context injection, and OpenAI-compatible SSE streaming. Turn any local model into a recency-aware, context-enhanced assistant instantly.
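For illustration, a minimal TypeScript sketch of streaming from an OpenAI-compatible gateway like this one, using the openai client; the base URL, port, and model name are assumptions rather than the project's documented defaults.

```ts
// Minimal sketch of a client talking to an OpenAI-compatible local gateway.
// The base URL, port, and model name below are assumptions, not the project's defaults.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:8000/v1", // assumed local gateway address
  apiKey: "not-needed",                // local gateways typically ignore the key
});

async function main() {
  // stream: true yields SSE chunks, exposed by the SDK as an async iterable
  const stream = await client.chat.completions.create({
    model: "local-model",              // assumed model identifier
    messages: [{ role: "user", content: "What changed in Node.js this month?" }],
    stream: true,
  });

  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}

main();
```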
A RESTful API testing framework using C#, .NET Core, xUnit, and SpecFlow with context injection, plus Flurl and Fluent Assertions, to test the JSONPlaceholder REST API.
An API gateway that connects user queries with RAG-based document search and LLM-powered natural language answers. Built for document Q&A and knowledge workflows.
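A rough sketch of the retrieve-then-answer flow such a gateway implements, assuming an Express server with stubbed retrieval and generation helpers; the endpoint path and field names are illustrative, not the project's actual API.

```ts
// Hypothetical retrieve-then-answer flow behind a RAG gateway of this kind.
import express from "express";

// Stub retriever: a real implementation would query a vector store.
async function retrieveDocuments(query: string): Promise<string[]> {
  return [`(document snippet relevant to: ${query})`];
}

// Stub generator: a real implementation would call a local or hosted LLM.
async function generateAnswer(prompt: string): Promise<string> {
  return `(LLM answer based on a prompt of ${prompt.length} characters)`;
}

const app = express();
app.use(express.json());

app.post("/ask", async (req, res) => {
  const question: string = req.body.question;
  const docs = await retrieveDocuments(question);

  // Context injection: retrieved passages are prepended to the user question.
  const prompt =
    `Answer using only the context below.\n\nContext:\n${docs.join("\n")}\n\nQuestion: ${question}`;

  res.json({ answer: await generateAnswer(prompt) });
});

app.listen(3000);
```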
High-performance Node.js framework with automatic parameter injection, eliminating route handler boilerplate. Features middleware chaining, a built-in HTTP client, a plugin system, and a CLI.
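A toy TypeScript illustration of the general idea of automatic parameter injection, where a dispatcher reads a handler's parameter names and supplies matching values; this is a conceptual sketch, not the framework's actual mechanism or API.

```ts
// Toy parameter injection: the dispatcher inspects a handler's parameter names
// and supplies matching values, so the handler never touches a raw request object.

type Context = Record<string, unknown>;

// Extract parameter names from a function's source text (toy approach;
// breaks with destructuring or defaults, which is fine for a sketch).
function paramNames(fn: Function): string[] {
  const match = fn.toString().match(/\(([^)]*)\)/);
  return match && match[1].trim()
    ? match[1].split(",").map((p) => p.trim())
    : [];
}

// Call the handler with arguments pulled from the context by name.
function invokeWithInjection(fn: Function, ctx: Context): unknown {
  return fn(...paramNames(fn).map((name) => ctx[name]));
}

// Handler declares only what it needs; no req/res boilerplate.
function getUser(userId: string, query: Record<string, string>) {
  return { userId, fields: query.fields };
}

console.log(
  invokeWithInjection(getUser, {
    userId: "42",
    query: { fields: "name,email" },
  })
);
// -> { userId: '42', fields: 'name,email' }
```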
Demo of context injection by an AI agent to create browser-based authorization flows during an AI-driven browser hijacking session.
Fine-tunes a T5-small model on the TellMeWhy dataset using context injection from a large language model (Gemini) to improve causal reasoning for “why” questions in narratives. Combines efficient training with human and automated evaluations to assess impact.
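A small sketch of the data-preparation side of this kind of context injection, where an auxiliary LLM-generated hint is spliced into the input before fine-tuning; the field names and prompt template are assumptions, not the repository's preprocessing code.

```ts
// Sketch of context injection at the data level: an auxiliary hint (from a
// large LLM in the real pipeline) is prepended to the question input that the
// smaller model is fine-tuned on.

interface TellMeWhyExample {
  narrative: string;
  question: string; // a "why" question about the narrative
  answer: string;   // gold answer used as the training target
}

// In the real pipeline the hint would come from Gemini; here it is a stub.
function auxiliaryHint(example: TellMeWhyExample): string {
  return `(LLM-generated background relevant to: ${example.question})`;
}

// Build the injected input string fed to the seq2seq model.
function buildTrainingInput(example: TellMeWhyExample): string {
  return [
    `context: ${auxiliaryHint(example)}`,
    `narrative: ${example.narrative}`,
    `question: ${example.question}`,
  ].join(" ");
}

const sample: TellMeWhyExample = {
  narrative: "Maya skipped lunch to finish her project before the deadline.",
  question: "Why did Maya skip lunch?",
  answer: "She wanted to finish her project before the deadline.",
};

console.log(buildTrainingInput(sample));
```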
DOM sensor and context injection middleware for AI web interfaces
Smart context injection tool for Claude Code CLI - automatically includes project-specific context, timestamps, and dynamic variables in every prompt
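A rough sketch of the underlying idea, assuming a simple {{placeholder}} template expanded with project metadata and a timestamp; the variable names and template syntax are illustrative, not the tool's actual configuration format.

```ts
// Expand a prompt template with project context and a timestamp before it is
// handed to the CLI. Placeholder syntax and variable names are assumptions.

const template =
  "Project: {{projectName}}\nCurrent time: {{timestamp}}\n\n{{userPrompt}}";

function injectContext(tpl: string, vars: Record<string, string>): string {
  // Replace each {{name}} placeholder with its value, leaving unknown
  // placeholders untouched so missing variables are easy to spot.
  return tpl.replace(/\{\{(\w+)\}\}/g, (whole, name) => vars[name] ?? whole);
}

const prompt = injectContext(template, {
  projectName: "my-app",
  timestamp: new Date().toISOString(),
  userPrompt: "Refactor the auth middleware to use async handlers.",
});

console.log(prompt);
```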
Context stuffing for LLMs