LlamaResearcher is your friendly research companion built on top of Llama 4, powered by Groq, LinkUp, LlamaIndex, Gradio, FastAPI and Redis.
Required: Docker and docker compose
The first step, common to both the Docker and the source code setup approaches, is to clone the repository and access it:
```bash
git clone https://github.com/AstraBert/llama-4-researcher.git
cd llama-4-researcher
```

Once there, you can follow this approach:
- Add the `groq_api_key`, `internal_api_key` and `linkup_api_key` variables to the `.env.example` file and rename the file to `.env`. Get these keys:
  - On the Groq Console
  - On the Linkup Dashboard
  - You can create your own internal key

```bash
mv .env.example .env
```

- Or do it manually:
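Since the internal key is one you invent yourself, any sufficiently random string works. One possible way to generate one (using `openssl` is just an assumption, not something the project requires):

```shell
# Generate a 64-character hex string to use as internal_api_key
# (any random string works; openssl is only one option)
openssl rand -hex 32
```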
```bash
docker compose -f compose.local.yaml up -d llama_redis
docker compose -f compose.local.yaml up -d llama_app
```

You will see the application running on http://localhost:8000 and be able to use it. Depending on your connection and your hardware, the setup might take some time (up to 15 minutes) - but this is only for the first time you run it!
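The `llama_redis` service started above backs API rate limiting. As a rough illustration of how a Redis-style fixed-window limiter works (the limit and window values here are illustrative, and a plain dict stands in for Redis `INCR`/`EXPIRE` calls - this is a sketch of the pattern, not the project's actual implementation):

```python
import time

class FixedWindowRateLimiter:
    """Sketch of the fixed-window rate-limiting pattern.

    A real deployment would replace the dict with redis-py calls,
    e.g. count = r.incr(key) followed by r.expire(key, window)
    when the key is new.
    """

    def __init__(self, limit: int = 10, window_seconds: int = 60):
        self.limit = limit
        self.window = window_seconds
        self._store = {}  # api_key -> (window_start, count); stands in for Redis

    def allow(self, api_key: str) -> bool:
        now = time.time()
        start, count = self._store.get(api_key, (now, 0))
        if now - start >= self.window:       # window expired -> reset counter
            start, count = now, 0
        if count >= self.limit:              # over the limit -> reject
            return False
        self._store[api_key] = (start, count + 1)
        return True

limiter = FixedWindowRateLimiter(limit=3, window_seconds=60)
print([limiter.allow("user-1") for _ in range(5)])  # [True, True, True, False, False]
```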
- Redis is used for API rate limiting control
- Your request is first checked for safety by a guard model, `llama-3-8b-guard`, provided by Groq
- If the prompt is safe, it is routed to the ResearcherAgent, a function-calling agent
- The ResearcherAgent first expands the query into three sub-queries, which are then used for web search
- The web is deep-searched for every sub-query with LinkUp
- The information retrieved from the web is evaluated for relevancy against the original user prompt
- Once the agent has gathered all the information, it writes the final essay and returns it to the user
Contributions are always welcome! Follow the contribution guidelines reported here.
The software is provided under the MIT license.