A Spring Boot web service that provides a REST API for interacting with AI language models through Ollama and Model Context Protocol (MCP) tools.
This application serves as a bridge between web clients and AI capabilities, combining:
- Ollama AI Chat Model: Local AI model integration using Llama 3.1 8B
- MCP (Model Context Protocol) Integration: Tool execution through MCP servers
- REST API: HTTP endpoints for chat interactions and system monitoring
- Simple Chat: Direct conversation with the AI model
- Conversational Chat: Maintains conversation history with unique session IDs
- System Prompt Chat: Custom system instructions for specialized responses
- Smart Tool Detection: Automatically determines when to use MCP tools based on user input
- Dynamic Tool Loading: Automatically discovers and loads MCP server tools
- GitHub Tools: Integrated MCP server for GitHub operations
- Builder Tools: Integrated MCP server for build operations
- Tool Status Monitoring: Real-time status of connected MCP servers
- Health Checks: Monitor Ollama and MCP server connectivity
- Tool Discovery: List available tools from all connected MCP servers
- Server Status: Track connection status of individual MCP servers
- Actuator Endpoints: Spring Boot management endpoints for monitoring
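As a rough illustration, the smart tool detection described above could be implemented as a keyword heuristic. This is a minimal sketch; the class name and keyword list are hypothetical and the application's actual detection logic may be entirely different.

```java
import java.util.List;
import java.util.Locale;

// Hypothetical sketch of keyword-based MCP tool detection.
public class ToolDetector {

    // Example trigger phrases; the real service may use another strategy.
    private static final List<String> TOOL_KEYWORDS = List.of(
            "list files", "create issue", "pull request", "repository", "build");

    // Returns true when the user message looks like it needs an MCP tool.
    public static boolean requiresTools(String message) {
        String normalized = message.toLowerCase(Locale.ROOT);
        return TOOL_KEYWORDS.stream().anyMatch(normalized::contains);
    }

    public static void main(String[] args) {
        System.out.println(requiresTools("List files in the current directory")); // true
        System.out.println(requiresTools("Hello, how are you?"));                 // false
    }
}
```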
- POST /api/chat - Simple chat interaction
- POST /api/chat/conversation - Chat with conversation history
- POST /api/chat/system - Chat with custom system prompt
- DELETE /api/chat/conversation/{id} - Clear conversation history
- GET /api/health - Application and service health status
- GET /api/info - Service information and configuration
- GET /api/tools - List available MCP tools
- GET /api/mcp/status - MCP server connection status
The application is configured via application.yaml:
- Model: Llama 3.1 8B
- Base URL: http://localhost:11434
- Timeout: 300 seconds
- GitHub Server: ./servers/mcp-server-github-0.0.1-SNAPSHOT.jar
- Builder Server: ./servers/mcp-server-builder-0.0.1-SNAPSHOT.jar
- Request Timeout: 60 seconds
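Putting the values above together, the application.yaml might look roughly like this. This is a sketch only: exact Spring AI property names vary between releases, and the MCP connection keys shown here are assumptions, not confirmed by this README.

```yaml
spring:
  ai:
    ollama:
      base-url: http://localhost:11434   # local Ollama endpoint
      chat:
        options:
          model: llama3.1:8b             # Llama 3.1 8B
    mcp:
      client:
        request-timeout: 60s             # key name is an assumption
        stdio:
          connections:                   # stdio wiring; keys may differ by version
            github:
              command: java
              args: ["-jar", "./servers/mcp-server-github-0.0.1-SNAPSHOT.jar"]
            builder:
              command: java
              args: ["-jar", "./servers/mcp-server-builder-0.0.1-SNAPSHOT.jar"]
```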
- Java 21
- Maven 3.6+
- Ollama running locally on port 11434 with Llama 3.1 8B model
- MCP Server JARs placed in the ./servers/ directory
- Install Ollama and pull the required model:
  ollama pull llama3.1:8b
- Create the servers directory and add MCP server JARs:
  mkdir servers
  # Copy your MCP server JARs to ./servers/
- Build and run:
  mvn clean package
  java -jar target/mcp-client-assistant-0.0.1-SNAPSHOT.jar
- Access the service at http://localhost:8080
curl -X POST http://localhost:8080/api/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello, how are you?"}'

curl -X POST http://localhost:8080/api/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "List files in the current directory"}'

curl http://localhost:8080/api/health

- Spring Boot 3.5.5 with Java 21
- Spring AI for LLM integration and MCP client support
- Ollama Integration for local AI model execution
- MCP Client for tool execution via external servers
- RESTful API design with JSON request/response
- Conversation Management with in-memory session storage
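The in-memory conversation management mentioned above could be sketched as a map of session IDs to message histories. This is a hypothetical illustration; the real service's storage classes are not shown in this README.

```java
import java.util.List;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;

// Hypothetical in-memory session store keyed by conversation id.
public class ConversationStore {

    private final Map<String, List<String>> sessions = new ConcurrentHashMap<>();

    // Create a new session and return its unique id.
    public String newSession() {
        String id = UUID.randomUUID().toString();
        sessions.put(id, new CopyOnWriteArrayList<>());
        return id;
    }

    // Append a message to a session's history, creating the session if absent.
    public void append(String sessionId, String message) {
        sessions.computeIfAbsent(sessionId, k -> new CopyOnWriteArrayList<>()).add(message);
    }

    public List<String> history(String sessionId) {
        return sessions.getOrDefault(sessionId, List.of());
    }

    // Would back DELETE /api/chat/conversation/{id}.
    public void clear(String sessionId) {
        sessions.remove(sessionId);
    }

    public static void main(String[] args) {
        ConversationStore store = new ConversationStore();
        String id = store.newSession();
        store.append(id, "Hello");
        System.out.println(store.history(id).size()); // 1
        store.clear(id);
        System.out.println(store.history(id).isEmpty()); // true
    }
}
```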
This project is licensed under the terms specified in the LICENSE file.