A modular and comprehensive solution to deploy a Multi-LLM and Multi-RAG powered chatbot (Amazon Bedrock, Anthropic, HuggingFace, OpenAI, Meta, AI21, Cohere, Mistral) using AWS CDK on AWS
Updated Mar 6, 2026 - TypeScript
This repository features three demos that can be easily integrated into your AWS environment. They serve as a practical guide to using AWS services to build a Large Language Model (LLM) generative AI application, aimed at a responsive question-and-answer bot and localized content generation.
A friendly guide to AWS Cloud fundamentals, with clear explanations, visuals, and practical examples for all audiences.
Your personal assistant at work
A question-answering chatbot built on Amazon Bedrock, with RAG backed by Amazon Kendra.
This app is a RAG (Retrieval-Augmented Generation) chatbot that uses Amazon Q with Slack as its interface. It also provides a CloudFront link whenever it cites a source.
An Amazon Kendra REST API CDK example with an API Gateway, including authentication with AWS Cognito and AWS X-Ray Tracing
BedrockChat acts as a conversational interface, leveraging generative AI models fine-tuned on your content.
Use Python to call AWS services from Lambda.
A Demo of Retrieval Augmented Generation with Amazon Titan, Bedrock, Kendra, and LangChain
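The RAG pattern these demos share follows the same shape: retrieve passages from Amazon Kendra, assemble a grounded prompt, and send it to a Bedrock model. A minimal sketch of the prompt-assembly step is below; the Kendra and Bedrock calls are shown only as comments, and the `DocumentTitle`/`Content` field names mimic Kendra's `Retrieve` result items (an assumption about the demos' exact flow, not their implementation).

```python
def build_rag_prompt(question: str, passages: list[dict]) -> str:
    """Assemble a grounded prompt from Kendra-style retrieval results.

    Each passage dict mimics an item from Kendra's Retrieve response:
    {"DocumentTitle": "...", "Content": "..."}.
    """
    # Concatenate retrieved passages, labeled by source document title.
    context = "\n\n".join(
        f"[{p['DocumentTitle']}]\n{p['Content']}" for p in passages
    )
    # Instruct the model to answer only from the retrieved context.
    return (
        "Answer the question using only the context below. "
        "If the answer is not in the context, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

# With boto3 (not imported here), the surrounding calls would be roughly:
#   passages = kendra.retrieve(IndexId=..., QueryText=question)["ResultItems"]
#   bedrock.invoke_model(modelId="amazon.titan-text-express-v1", body=...)

prompt = build_rag_prompt(
    "What is Amazon Kendra?",
    [{"DocumentTitle": "Kendra FAQ",
      "Content": "Amazon Kendra is an intelligent search service."}],
)
print(prompt)
```

Keeping prompt assembly separate from the retrieval and generation calls makes the grounding logic unit-testable without any AWS credentials.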