Azure AI Foundry Agent MCP Server with user isolation and session management.
Install dependencies:

```shell
pip install -r requirements.txt
```

Create a `.env` file:

```shell
cp .env.example .env
# Edit the .env file with your Azure configuration
```

Start the server:

```shell
# HTTP MCP mode (recommended for VSCode GitHub Copilot)
python start_server.py --mode http

# Other modes
python start_server.py --mode stdio   # stdio MCP
python start_server.py --mode api     # RESTful API
```

Add the MCP server configuration in Claude Code or VSCode GitHub Copilot:
```json
{
  "servers": {
    "ai-foundry-agent": {
      "type": "streamableHttp",
      "url": "http://127.0.0.1:8000/mcp/",
      "headers": {
        "Content-Type": "application/json",
        "Authorization": "Bearer your_token_here"
      }
    }
  }
}
```

For stdio-based MCP clients, configure the environment variable and the client config:
Environment variable:

```shell
export MCP_STDIO_TOKEN="your_token_here"
# On Windows:
set MCP_STDIO_TOKEN=your_token_here
```

MCP client configuration:
```json
{
  "servers": {
    "ai-foundry-agent": {
      "type": "stdio",
      "command": "python",
      "args": ["start_server.py", "--mode", "stdio"],
      "cwd": "/path/to/ai-foundry-agent-mcp",
      "env": {
        "MCP_STDIO_TOKEN": "your_token_here"
      }
    }
  }
}
```

This is a Model Context Protocol (MCP) server that lets developers access Azure AI Foundry Agents from VSCode GitHub Copilot. The core design goal is seamless integration of Azure AI services into VSCode development workflows through the standardized MCP protocol.
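Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages. For illustration only (field values are placeholders, not captured traffic), a `tools/call` request to this server's `send_message` tool would look roughly like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "send_message",
    "arguments": { "message": "Hello" }
  }
}
```

Clients such as VSCode GitHub Copilot generate these envelopes automatically from the configuration above.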
- 🔐 Token-based Authentication: Uses Bearer token in Authorization header for user identification
- 👥 User Isolation: Each token corresponds to independent Azure AI Agent threads, ensuring data security
- 💬 Session Management: Support for clearing conversations and creating new sessions with flexible context control
- 🚀 Multiple Deployment Modes: HTTP MCP, stdio MCP, and RESTful API deployment options
- 🛠️ Complete MCP Support: Full implementation of standard MCP protocol core features
- 🧪 Test Mode: Built-in mock functionality for development without Azure credentials
- Python 3.8+
- Azure AI Foundry Account: Requires Azure AI Project and Agent setup
- Azure App Registration: Application credentials for service authentication
| Variable | Description | Example |
|---|---|---|
| `AZURE_TENANT_ID` | Azure tenant ID | `12345678-1234-1234-1234-123456789012` |
| `AZURE_CLIENT_ID` | Azure application (client) ID | `87654321-4321-4321-4321-210987654321` |
| `AZURE_CLIENT_SECRET` | Azure application client secret | `your_client_secret_here` |
| `AZURE_ENDPOINT` | Azure AI Project endpoint | `https://your-project.services.ai.azure.com/api/projects/your-project` |
| `AZURE_AGENT_ID` | Azure AI Agent ID | `asst_xxxxxxxxxxxxxxxxx` |
| `MCP_STDIO_TOKEN` | Token for stdio mode authentication (optional) | `your_stdio_token_here` |
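Put together, a filled-in `.env` file might look like the following (all values are the placeholders from the table above; substitute your own):

```shell
# .env — placeholder values, replace with your Azure configuration
AZURE_TENANT_ID=12345678-1234-1234-1234-123456789012
AZURE_CLIENT_ID=87654321-4321-4321-4321-210987654321
AZURE_CLIENT_SECRET=your_client_secret_here
AZURE_ENDPOINT=https://your-project.services.ai.azure.com/api/projects/your-project
AZURE_AGENT_ID=asst_xxxxxxxxxxxxxxxxx
MCP_STDIO_TOKEN=your_stdio_token_here
```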
```shell
python start_server.py [options]

Options:
  --mode {http,stdio,api}   Server mode (default: http)
  --port PORT               HTTP port (default: 8000)
  --log-level LEVEL         Log level (default: INFO)
  --help                    Show help information
```

`send_message`: Send a message to the Azure AI Agent and receive a response.
Parameters:
- `message` (string): Message content to send

Example:

```json
{
  "name": "send_message",
  "arguments": {
    "message": "Hello, please introduce yourself"
  }
}
```

`list_messages`: List all messages in the current conversation thread.
Example:

```json
{
  "name": "list_messages",
  "arguments": {}
}
```

`clear_conversation`: Clear the current conversation and start a new thread.
Example:

```json
{
  "name": "clear_conversation",
  "arguments": {}
}
```

`new_conversation`: Start a new conversation thread (equivalent to `clear_conversation`).
Example:

```json
{
  "name": "new_conversation",
  "arguments": {}
}
```

MCP endpoints:
- `POST /mcp/` - MCP protocol message handling
- `GET /mcp/` - Service health check

REST API endpoints:
- `POST /api/send_message` - Send message
- `GET /api/list_messages` - List message history
- `POST /api/clear_conversation` - Clear conversation
- `POST /api/new_conversation` - Create new conversation
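The REST endpoints can also be called from a small Python client. A minimal stdlib-only sketch (assumes the default `http://localhost:8000` and a placeholder token; the helper names `build_request` and `call` are illustrative, not part of the project):

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # assumed default from --port 8000

def build_request(path, token, payload=None, method="POST"):
    """Build an authenticated urllib Request for the REST API."""
    headers = {"Authorization": "Bearer " + token}
    data = None
    if payload is not None:
        headers["Content-Type"] = "application/json"
        data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(BASE_URL + path, data=data,
                                  headers=headers, method=method)

def call(path, token, payload=None, method="POST"):
    """Send a request and decode the JSON response body."""
    req = build_request(path, token, payload, method)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Usage (requires the server running in API mode):
#   token = "your_token_here"
#   call("/api/new_conversation", token)
#   call("/api/send_message", token, {"message": "What is Azure AI Foundry?"})
#   call("/api/list_messages", token, method="GET")
```

On failure the server returns a body with `"success": false` plus `error` and `code` fields, so callers can branch on `result["success"]` instead of relying solely on HTTP status.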
When running the server in API mode (`--mode api`), you can interact with the Azure AI Foundry Agent through standard HTTP requests using curl or any HTTP client.

All API requests require authentication via a Bearer token in the `Authorization` header:

```
Authorization: Bearer your_token_here
```

Send a message to the Azure AI Agent and receive a response.
Endpoint: `POST /api/send_message`

Request body:

```json
{
  "message": "Hello, please introduce yourself"
}
```

curl example:
```shell
curl -X POST http://localhost:8000/api/send_message \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your_token_here" \
  -d '{
    "message": "Hello, please introduce yourself"
  }'
```

Response:
```json
{
  "success": true,
  "response": "Hello! I'm an AI assistant powered by Azure AI Foundry...",
  "message_id": "msg_abc123"
}
```

Retrieve all messages in the current conversation thread.
Endpoint: `GET /api/list_messages`

curl example:

```shell
curl -X GET http://localhost:8000/api/list_messages \
  -H "Authorization: Bearer your_token_here"
```

Response:
```json
{
  "success": true,
  "messages": [
    {
      "id": "msg_abc123",
      "role": "user",
      "content": "Hello, please introduce yourself",
      "timestamp": "2024-01-15T10:30:00Z"
    },
    {
      "id": "msg_def456",
      "role": "assistant",
      "content": "Hello! I'm an AI assistant...",
      "timestamp": "2024-01-15T10:30:05Z"
    }
  ],
  "thread_id": "thread_xyz789"
}
```

Clear the current conversation and start a new thread.
Endpoint: `POST /api/clear_conversation`

curl example:

```shell
curl -X POST http://localhost:8000/api/clear_conversation \
  -H "Authorization: Bearer your_token_here"
```

Response:
```json
{
  "success": true,
  "message": "Conversation cleared successfully",
  "new_thread_id": "thread_new123"
}
```

Start a new conversation thread (equivalent to `clear_conversation`).
Endpoint: `POST /api/new_conversation`

curl example:

```shell
curl -X POST http://localhost:8000/api/new_conversation \
  -H "Authorization: Bearer your_token_here"
```

Response:
```json
{
  "success": true,
  "message": "New conversation started",
  "thread_id": "thread_new456"
}
```

Authentication error:
```shell
# Missing or invalid token
curl -X POST http://localhost:8000/api/send_message \
  -H "Content-Type: application/json" \
  -d '{"message": "test"}'
```

Response:
```json
{
  "success": false,
  "error": "Unauthorized: Invalid or missing token",
  "code": 401
}
```

Validation error:
```shell
# Missing required field
curl -X POST http://localhost:8000/api/send_message \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your_token_here" \
  -d '{}'
```

Response:
```json
{
  "success": false,
  "error": "Missing required field: message",
  "code": 400
}
```

A complete workflow example:

```shell
# 1. Start a new conversation
curl -X POST http://localhost:8000/api/new_conversation \
  -H "Authorization: Bearer your_token_here"

# 2. Send a message
curl -X POST http://localhost:8000/api/send_message \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your_token_here" \
  -d '{
    "message": "What is Azure AI Foundry?"
  }'

# 3. Send a follow-up message
curl -X POST http://localhost:8000/api/send_message \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your_token_here" \
  -d '{
    "message": "Can you give me more details about its features?"
  }'

# 4. List all messages in the conversation
curl -X GET http://localhost:8000/api/list_messages \
  -H "Authorization: Bearer your_token_here"

# 5. Clear the conversation when done
curl -X POST http://localhost:8000/api/clear_conversation \
  -H "Authorization: Bearer your_token_here"
```

The server currently uses simplified token validation (mock mode). Production deployments must replace it with real authentication logic:
```python
def validate_token(token: str) -> bool:
    # Implement real token validation logic here,
    # e.g., JWT verification or a database lookup.
    return verify_jwt_token(token)  # verify_jwt_token is an example placeholder
```

Critical for data security: user isolation ensures complete data separation between users, preventing information leakage and helping maintain privacy compliance (GDPR, HIPAA, SOC 2).
Implementation:
- User ID generation: a SHA256 hash of the token (first 16 chars) ensures a deterministic but secure mapping
- Thread isolation: each user ID gets a dedicated Azure AI Agent thread with a separate conversation context
- Data persistence: the user-thread mapping is saved in `user_thread_mapping.json` for consistency across restarts
- Concurrency safety: thread locks prevent race conditions in concurrent access scenarios
Security Considerations: Production deployments should implement proper token validation, rotation mechanisms, audit logging, and per-user rate limiting.
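The isolation scheme above can be sketched in a few lines of stdlib Python. This is a hypothetical illustration of the described design, not the project's actual code; the names `get_or_create_thread` and the `create_thread` callback are assumptions:

```python
import hashlib
import json
import threading
from pathlib import Path

MAPPING_FILE = Path("user_thread_mapping.json")  # persisted user-thread mapping
_lock = threading.Lock()

def user_id_from_token(token: str) -> str:
    """Deterministic user ID: first 16 hex chars of the token's SHA256 hash."""
    return hashlib.sha256(token.encode("utf-8")).hexdigest()[:16]

def get_or_create_thread(token: str, create_thread) -> str:
    """Map a token to its dedicated thread ID, creating one on first use.

    `create_thread` stands in for a call that asks Azure AI Foundry
    for a new agent thread and returns its ID.
    """
    uid = user_id_from_token(token)
    with _lock:  # guard against races between concurrent requests
        mapping = json.loads(MAPPING_FILE.read_text()) if MAPPING_FILE.exists() else {}
        if uid not in mapping:
            mapping[uid] = create_thread()
            MAPPING_FILE.write_text(json.dumps(mapping))  # survive restarts
        return mapping[uid]
```

Because the user ID is a one-way hash, the mapping file never stores raw tokens, and two different tokens can never collide onto the same conversation thread in practice.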
```shell
# Run the manual test to verify core functionality
python manual_test.py
```

The project supports a test mode that requires no Azure credentials; enable it with the environment variable `TEST_MODE=true`.
Enable verbose logging:

```shell
python start_server.py --mode http --log-level debug
```

Use test mode:
```shell
export TEST_MODE=true
python start_server.py --mode http
```

MIT License - see the LICENSE file for details.