A sample implementation of a production-ready MCP setup (client/server) with authentication, fully implemented in TypeScript (Node) using modern ES modules.
> [!NOTE]
> For the sake of demonstration, the agent connects to a locally hosted LLM served by Ollama. Feel free to integrate an AI stack of your choice instead (see the client implementation below).
Before running either the server or client, ensure you have the following prerequisites in place.
- Node.js (v18 or higher)
  - Download and install from nodejs.org
  - Verify the installation: `node --version`
- Ollama (for the client demonstration)
  - Feel free to use another LLM (running locally or in the cloud) and adjust the client implementation below.
  - Install Ollama from ollama.ai
  - Pull the required model: `ollama pull qwen3:4b`
  - Ensure Ollama is running on `http://localhost:11434`
This reference implementation needs a Microsoft Entra ID tenant with an app registration configured:
- Microsoft Entra ID App Registration
  - Configure a new app registration in Microsoft Entra ID with:
    - An exposed API with the required scope
    - API permissions for the custom scope `mcp:tools`
    - Authentication settings (for interactive login in the client)

> [!TIP]
> Regarding the API scope: make sure you add a dedicated scope by exposing a web API in your app registration (see https://learn.microsoft.com/en-us/entra/identity-platform/quickstart-configure-app-expose-web-apis#add-a-scope-requiring-admin-and-user-consent in case you need any guidance).
This information will be used within the client and server setup:
- `AZURE_TENANT_ID`: Your Entra ID tenant
- `AZURE_CLIENT_ID`: Your app registration client ID
- `AZURE_CLIENT_SECRET`: Your app registration client secret (for the server)
- `AZURE_CLIENT_SCOPE`: The API scope (format: `api://<client-id>/mcp:tools`)
This monorepo contains two main components:
- `/server`: MCP server implementation with Express.js, JWT authentication, and the MCP SDK
- `/client`: MCP client implementation with AI SDK integration and an interactive CLI
Key Features:
- JWT Authentication: Validates tokens from Entra ID using JWKS
- Stateless Mode: Creates a new transport for each request to prevent request ID collisions (see the sketch after the architecture diagram below)
- Secure API: Validates scopes, issuer, and token signatures
- Express.js: HTTP server with JSON-RPC over HTTP transport
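
As a rough illustration of the JWT validation step (not the project's `validation.ts`, just a minimal sketch using the `jose` library), an Express middleware that checks signature, issuer, audience, and the `mcp:tools` scope might look like this; the issuer and audience values assume v2.0 tokens and may differ depending on how the app registration is configured:

```ts
// Sketch: Express middleware validating Entra ID bearer tokens via JWKS (using the jose library).
import { createRemoteJWKSet, jwtVerify } from 'jose';
import type { NextFunction, Request, Response } from 'express';

const tenantId = process.env.AZURE_TENANT_ID!;
const clientId = process.env.AZURE_CLIENT_ID!;

// Entra ID publishes the tenant's signing keys at this well-known JWKS endpoint.
const jwks = createRemoteJWKSet(
  new URL(`https://login.microsoftonline.com/${tenantId}/discovery/v2.0/keys`)
);

export async function requireMcpToolsScope(req: Request, res: Response, next: NextFunction) {
  const header = req.headers.authorization ?? '';
  const token = header.startsWith('Bearer ') ? header.slice('Bearer '.length) : undefined;
  if (!token) {
    res.status(401).json({ error: 'Missing bearer token' });
    return;
  }
  try {
    // Verifies signature, expiry, issuer, and audience in one call.
    // The audience may be `api://<client-id>` instead, depending on how the API is exposed.
    const { payload } = await jwtVerify(token, jwks, {
      issuer: `https://login.microsoftonline.com/${tenantId}/v2.0`,
      audience: clientId,
    });
    // Delegated Entra ID tokens list the granted scopes in the space-separated "scp" claim.
    const scopes = typeof payload.scp === 'string' ? payload.scp.split(' ') : [];
    if (!scopes.includes('mcp:tools')) {
      res.status(403).json({ error: 'Token is missing the mcp:tools scope' });
      return;
    }
    next();
  } catch {
    res.status(401).json({ error: 'Invalid or expired token' });
  }
}
```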
Server Architecture:
```
server/
├── src/
│   ├── main.ts              # Express server setup
│   ├── auth/
│   │   └── validation.ts    # JWT validation middleware
│   ├── context/
│   │   └── AuthContext.ts   # Authentication context management
│   ├── mcp/
│   │   └── server.ts        # MCP server implementation
│   └── utils/
│       └── api-client.ts    # API client utilities
```
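
For the stateless mode, the key idea is that every incoming HTTP request gets its own server/transport pair, following the pattern documented for the MCP TypeScript SDK. This is a minimal sketch rather than the repository's `server.ts`; the `echo` tool and the port handling are illustrative:

```ts
// Sketch: stateless MCP-over-HTTP endpoint — a fresh server/transport pair per request,
// so JSON-RPC request IDs from concurrent callers cannot collide.
import express from 'express';
import { z } from 'zod';
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';

const app = express();
app.use(express.json());

// In the real server, the JWT validation middleware shown above guards this route.
app.post('/mcp', async (req, res) => {
  const server = new McpServer({ name: 'sample-mcp-server', version: '1.0.0' });

  // Illustrative tool; the actual project registers its own tools here.
  server.tool('echo', 'Echoes the given message', { message: z.string() }, async ({ message }) => ({
    content: [{ type: 'text', text: message }],
  }));

  // sessionIdGenerator: undefined puts the transport into stateless mode.
  const transport = new StreamableHTTPServerTransport({ sessionIdGenerator: undefined });
  res.on('close', () => {
    transport.close();
    server.close();
  });

  await server.connect(transport);
  await transport.handleRequest(req, res, req.body);
});

app.listen(Number(process.env.REMOTE_MCP_SERVER_PORT ?? 3000));
```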
Key Features:
- Interactive Authentication: Uses Entra ID interactive token request flow
- MCP Client: Connects to remote MCP server with HTTP transport
- AI SDK Integration: Works with the Vercel AI SDK for LLM interactions (see the sketch after the architecture diagram below)
- Streaming Responses: Real-time streaming of LLM responses
- Tool Calling: Automatically invokes MCP tools based on LLM requests
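
The interactive token acquisition can be sketched with `@azure/identity` (the project's `TokenAcquisitionHelper.ts` may use MSAL directly instead; the redirect URI below is illustrative and must match the app registration):

```ts
// Sketch: acquiring a delegated access token for the MCP scope via an interactive browser login.
import { InteractiveBrowserCredential } from '@azure/identity';

export async function acquireMcpAccessToken(): Promise<string> {
  const credential = new InteractiveBrowserCredential({
    tenantId: process.env.AZURE_TENANT_ID,
    clientId: process.env.AZURE_CLIENT_ID,
    // Must match a redirect URI configured on the app registration (value here is illustrative).
    redirectUri: 'http://localhost:1337',
  });

  // e.g. AZURE_CLIENT_SCOPE=api://<client-id>/mcp:tools
  const accessToken = await credential.getToken(process.env.AZURE_CLIENT_SCOPE!);
  if (!accessToken) throw new Error('Token acquisition failed');
  return accessToken.token;
}
```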
Client Architecture:
```
client/
├── src/
│   ├── main.ts                        # CLI application entry point
│   ├── bootstrap.ts                   # Environment configuration loader
│   ├── mcp/
│   │   └── client.ts                  # MCP client factory
│   └── utils/
│       └── TokenAcquisitionHelper.ts  # Entra ID token acquisition
```
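
Putting the pieces together, a simplified version of the client flow (not the repository's `main.ts`) could look like the following, assuming the Vercel AI SDK's experimental MCP client and the community `ollama-ai-provider` package:

```ts
// Sketch: connect to the remote MCP server with a bearer token, expose its tools to the model,
// and stream the answer back to the console.
import { experimental_createMCPClient, streamText } from 'ai';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';
import { ollama } from 'ollama-ai-provider';

export async function ask(question: string, accessToken: string) {
  // Authenticate every MCP request with the Entra ID access token.
  const transport = new StreamableHTTPClientTransport(
    new URL(process.env.REMOTE_MCP_SERVER_BASE_URL ?? 'http://localhost:3000/mcp'),
    { requestInit: { headers: { Authorization: `Bearer ${accessToken}` } } },
  );

  const mcpClient = await experimental_createMCPClient({ transport });
  try {
    const tools = await mcpClient.tools(); // registered MCP tools, usable by the model

    const result = streamText({
      model: ollama('qwen3:4b'),
      tools,
      prompt: question,
      // Depending on the AI SDK version, a multi-step option may be needed for tool round-trips.
    });

    for await (const chunk of result.textStream) {
      process.stdout.write(chunk);
    }
  } finally {
    await mcpClient.close();
  }
}
```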
The MCP server provides authenticated access to MCP tools via HTTP transport.
Navigate to the server directory and create a .env file from the template:
```
cd server
cp .env.template .env
```

Edit `.env` with your Entra ID credentials:

```
# Entra ID Configuration for OAuth2/MSAL
AZURE_TENANT_ID=your-tenant-id-here
AZURE_CLIENT_ID=your-client-id-here
AZURE_CLIENT_SECRET=your-client-secret-here

# MCP Server Configuration
REMOTE_MCP_SERVER_PORT=3000
```

Install the dependencies:

```
npm install
```

Start the server in development mode with auto-reload:

```
npm start
```

Or build and run in production mode:

```
npm run build
node dist/main.js server
```

The server should be running on http://localhost:3000/mcp. You should see:

```
MCP Server running on http://localhost:3000/mcp
```
The MCP client demonstrates how to connect to the MCP server, acquire authentication tokens, and interact with LLMs using MCP tools.
Navigate to the client directory and create a .env.development file from the template:
```
cd client
cp .env.template .env.development
```

Edit `.env.development` with your configuration:

```
# Entra ID Configuration for OAuth2/MSAL
AZURE_TENANT_ID=your-tenant-id-here
AZURE_CLIENT_ID=your-client-id-here
AZURE_CLIENT_SCOPE=api://your-client-id-here/mcp:tools
REMOTE_MCP_SERVER_BASE_URL=http://localhost:3000/mcp
```

Install the dependencies:

```
npm install
```

Before running the client:
- ✅ The MCP server is running (see above: Server > Run the server)
- ✅ Ollama is running with the required model (`ollama serve`, then `ollama pull qwen3:4b`)
Start the interactive CLI application:
```
npm start
```

- The application will prompt you to authenticate with Entra ID (interactive browser login)
- After successful authentication, you'll see the registered MCP tools
- Start asking questions in the interactive prompt
Example interaction:
```
Simple console application
Acquiring an access token from Entra ID... ✅ ok
Registered MCP Server tools: ['tool1', 'tool2', ...]
Starting the command line prompting game.
Have fun! Type 'exit' or 'quit' to terminate.

Ask: How many users work in my company?

AI's response:
[Streaming response from LLM using MCP tools...]
```
The client includes examples for both Ollama and OpenAI. To use OpenAI instead:
- Uncomment the OpenAI example in `client/src/main.ts`
- Set your OpenAI API key: `export OPENAI_API_KEY=your-key`
- Replace the `streamText` call with the `generateText` call
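
As a rough sketch of that swap (the model name is illustrative, and `tools` and `question` are reused from the client flow shown earlier):

```ts
// Sketch: replacing the streaming Ollama call with a non-streaming OpenAI call.
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

const { text } = await generateText({
  model: openai('gpt-4o-mini'), // reads OPENAI_API_KEY from the environment
  tools,                        // same MCP tools as in the Ollama example
  prompt: question,
});
console.log(text);
```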
- Port already in use: Change `REMOTE_MCP_SERVER_PORT` in `.env` or terminate the already running server
- Token validation fails: Verify the Entra ID app registration configuration
- JWKS errors: Check that your app registration has the correct signing keys
- Authentication fails: Ensure the Entra ID app has the correct redirect URIs configured
- Cannot connect to server: Verify `REMOTE_MCP_SERVER_BASE_URL` and that the server is running
- Ollama errors: Ensure Ollama is running (`ollama serve`) and the model is pulled
- Module resolution errors: Ensure you're using Node.js v22+ with ES modules support
- TypeScript errors: Run `npm run type-check` to validate the TypeScript configuration
- Network errors: Check firewall settings and localhost connectivity
Both server and client support the following npm scripts:
- `npm start`: Run in development mode
- `npm run build`: Compile TypeScript to JavaScript
- `npm run type-check`: Check TypeScript types without emitting files
- Never commit `.env` files: These files contain sensitive credentials
- Token validation: The server validates tokens using JWKS from Entra ID
- Scope checking: Ensure the `mcp:tools` scope is properly configured
- Production deployment: Use proper secret management (Azure Key Vault, environment variables, etc.)