A NestJS-based backend service for generating quiz questions and evaluating responses using AI models.
Quizz Backend is a powerful API service that leverages AI to:
- Generate customized quiz questions on any topic
- Evaluate responses to questions, providing both good and bad example answers
- Deliver results through a clean, well-documented REST API
The service uses OpenAI models to generate high-quality questions and responses, making it suitable for educational applications, learning platforms, or any system requiring dynamic quiz generation.
- AI-Powered Question Generation: Create questions on any topic with customizable difficulty
- Response Evaluation: Submit responses and get evaluations of their correctness
- OpenAPI Documentation: Comprehensive API documentation with Swagger UI
- Containerized Deployment: Easy deployment with Docker
- Environment Configuration: Flexible configuration through environment variables
- Kubernetes Deployment: Deploy to Kubernetes using Helm charts
- Framework: NestJS
- Language: TypeScript
- AI Integration: OpenAI API via AI SDK
- API Documentation: Swagger/OpenAPI
- Containerization: Docker
- Orchestration: Kubernetes with Helm
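As a rough illustration of the AI SDK integration listed above, the sketch below uses `createOpenAI` and `generateObject` to request structured questions from the configured model. The schema, function name, and prompt are assumptions for illustration, not this project's actual code.

```typescript
// Illustrative sketch only: schema and function names are assumptions.
import { createOpenAI } from '@ai-sdk/openai';
import { generateObject } from 'ai';
import { z } from 'zod';

// A dedicated provider for question generation, mirroring the split between
// question and response configuration in the environment variables below.
const questionProvider = createOpenAI({
  apiKey: process.env.OPENAI_QUESTION_API_KEY,
  baseURL: process.env.OPENAI_QUESTION_BASE_URL,
});

// Hypothetical shape for a generated question set.
const questionSetSchema = z.object({
  questions: z.array(
    z.object({
      prompt: z.string(),
      difficulty: z.enum(['easy', 'medium', 'hard']),
    }),
  ),
});

export async function generateQuestions(topic: string, difficulty: string) {
  const { object } = await generateObject({
    model: questionProvider(process.env.OPENAI_QUESTION_MODEL ?? 'gpt-4'),
    schema: questionSetSchema,
    prompt: `Generate quiz questions about "${topic}" at ${difficulty} difficulty.`,
  });
  return object.questions;
}
```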
- `POST /questions`: Create a new set of questions based on a provided topic and difficulty
- `POST /responses`: Submit a response for a given question and evaluate its correctness
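As an illustration of how a client might call these endpoints, the sketch below posts a topic to generate questions and then submits an answer for evaluation. The field names (`topic`, `difficulty`, `questionId`, `answer`) are assumptions; the Swagger UI documents the actual request and response schemas.

```typescript
// Field names are assumptions for illustration; see the Swagger UI for the real schemas.
async function main() {
  const base = 'http://localhost:3000';

  // Ask the service to generate a question set for a topic and difficulty.
  const questionsRes = await fetch(`${base}/questions`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ topic: 'TypeScript generics', difficulty: 'medium' }),
  });
  const questions = await questionsRes.json();
  console.log(questions);

  // Submit an answer to one of the generated questions for evaluation.
  const responseRes = await fetch(`${base}/responses`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      questionId: questions[0]?.id, // hypothetical field
      answer: 'Generics let you parameterize types over other types.',
    }),
  });
  console.log(await responseRes.json());
}

main().catch(console.error);
```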
- Node.js (LTS version recommended)
- Yarn package manager
- OpenAI API key
Create a .env file in the root directory with the following variables:
```
# OpenAI Configuration for Question Generation
OPENAI_QUESTION_API_KEY=your_openai_api_key
OPENAI_QUESTION_BASE_URL=https://api.openai.com/v1
OPENAI_QUESTION_MODEL=gpt-4

# OpenAI Configuration for Response Evaluation
OPENAI_RESPONSE_API_KEY=your_openai_api_key
OPENAI_RESPONSE_BASE_URL=https://api.openai.com/v1
OPENAI_RESPONSE_MODEL=gpt-4

# Server Configuration
PORT=3000
```
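A minimal sketch of how these variables might be read with `@nestjs/config` is shown below; the class name and defaults are assumptions, and the actual service may wire configuration differently.

```typescript
// Sketch only: how the question-generation settings could be exposed via ConfigService.
import { Injectable } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';

@Injectable()
export class QuestionAiConfig {
  constructor(private readonly config: ConfigService) {}

  get apiKey(): string {
    // Fails fast at startup if the key is missing.
    return this.config.getOrThrow<string>('OPENAI_QUESTION_API_KEY');
  }

  get baseUrl(): string {
    return this.config.get<string>('OPENAI_QUESTION_BASE_URL', 'https://api.openai.com/v1');
  }

  get model(): string {
    return this.config.get<string>('OPENAI_QUESTION_MODEL', 'gpt-4');
  }
}
```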
```bash
# Install dependencies
yarn install

# Start development server
yarn start:dev

# Build for production
yarn build

# Start production server
yarn start:prod
```

To build and run the service with Docker:

```bash
# Build the Docker image
docker build -t quiz-backend .

# Run the container
docker run -p 3000:3000 --env-file ./.env quiz-backend
```

The application can be deployed to a Kubernetes cluster using the provided Helm chart.
- Kubernetes 1.19+
- Helm 3.2.0+
```bash
# Update dependencies
helm dependency update ./helm/quiz-backend

# Install the chart
helm install quiz-backend ./helm/quiz-backend

# Uninstall the chart
helm delete quiz-backend
```

See the Helm chart README for detailed configuration options.
Once the server is running, the Swagger UI documentation is available in your browser, and the raw OpenAPI specification is served alongside it.
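For reference, a typical NestJS Swagger setup looks like the sketch below. With this wiring, `@nestjs/swagger` serves the UI at the chosen path and the JSON document at that path plus `-json`; the title, description, and the `api` path are assumptions, so the routes exposed by this project may differ.

```typescript
// Sketch of a typical main.ts Swagger setup; values are illustrative.
import { NestFactory } from '@nestjs/core';
import { DocumentBuilder, SwaggerModule } from '@nestjs/swagger';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);

  const config = new DocumentBuilder()
    .setTitle('Quizz Backend')
    .setDescription('Quiz question generation and response evaluation API')
    .setVersion('1.0')
    .build();

  // Serves the Swagger UI at /api and the OpenAPI JSON at /api-json by default.
  SwaggerModule.setup('api', app, SwaggerModule.createDocument(app, config));

  await app.listen(process.env.PORT ?? 3000);
}
bootstrap();
```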
- `yarn build`: Build the application
- `yarn format`: Format code using Prettier
- `yarn start`: Start the application
- `yarn start:dev`: Start the application in watch mode
- `yarn start:debug`: Start the application in debug mode
- `yarn start:prod`: Start the production build
- `yarn lint`: Lint the code
- `yarn test`: Run tests
- `yarn test:watch`: Run tests in watch mode
- `yarn test:cov`: Run tests with coverage
- `yarn test:debug`: Debug tests
- `yarn test:e2e`: Run end-to-end tests
- The API endpoints utilize AI models and may take some time to respond
- The responses from the AI models are not deterministic
- It is recommended to save generated content if you need to refer to it later
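Because re-running generation will not reproduce the same questions, persisting the output is the simplest safeguard. The sketch below writes a question set to a timestamped JSON file; the file naming scheme is just an example.

```typescript
// Example only: persist a generated question set so it can be referenced later,
// since re-running the generation endpoint will produce different questions.
import { writeFile } from 'node:fs/promises';

export async function saveQuestionSet(topic: string, questions: unknown): Promise<string> {
  const file = `questions-${topic.replace(/\s+/g, '-')}-${Date.now()}.json`;
  await writeFile(file, JSON.stringify(questions, null, 2), 'utf8');
  return file;
}
```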
This project is marked UNLICENSED; it is not distributed under an open-source license.