Employee Leave Management MCP Server (Streamable HTTP)
A self-hosted AI stack combining Ollama for running language models, Open WebUI for user-friendly chat interaction, and MCP (Model Context Protocol) for exposing tools to the model, offering full control, privacy, and flexibility without relying on the cloud.
This sample project provides an MCP-based tool server for managing employee leave balance, applications, and history. It is exposed via OpenAPI using mcpo for easy integration with Open WebUI or other OpenAPI-compatible clients.
leave-manager/
├── main.py             # MCP server logic for leave management
├── requirements.txt    # Python dependencies for the MCP server
├── Dockerfile          # Docker image configuration for the leave manager
├── docker-compose.yml  # Docker Compose file to run leave manager and Open WebUI
└── README.md           # Project documentation (this file)
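The actual contents of main.py are not reproduced here, but the core of such a leave-management tool server can be sketched in plain Python. Everything below (function names, seed data, employee IDs) is illustrative, not the project's real implementation:

```python
# Illustrative sketch only: an in-memory leave store similar in spirit to
# what main.py might implement. Names and seed data are hypothetical.

# Each employee has a remaining balance and a history of leave dates.
LEAVE_DB = {
    "E001": {"balance": 18, "history": ["2025-01-02", "2025-03-14"]},
}

def get_leave_balance(employee_id: str) -> str:
    """Report how many leave days an employee has left."""
    record = LEAVE_DB.get(employee_id)
    if record is None:
        return f"No record found for {employee_id}."
    return f"{employee_id} has {record['balance']} leave day(s) remaining."

def apply_leave(employee_id: str, leave_dates: list[str]) -> str:
    """Deduct the requested days from the balance and record them."""
    record = LEAVE_DB.get(employee_id)
    if record is None:
        return f"No record found for {employee_id}."
    if len(leave_dates) > record["balance"]:
        return "Insufficient leave balance."
    record["balance"] -= len(leave_dates)
    record["history"].extend(leave_dates)
    return f"Leave applied for {len(leave_dates)} day(s). Remaining: {record['balance']}."

def get_leave_history(employee_id: str) -> str:
    """List the dates on which an employee has taken leave."""
    record = LEAVE_DB.get(employee_id)
    if record is None:
        return f"No record found for {employee_id}."
    return f"{employee_id} took leave on: {', '.join(record['history'])}."
```

In the real server these functions would be registered as MCP tools so that mcpo can publish them as OpenAPI endpoints.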
Prerequisites:
- The deepseek-r1 model (pulled via Ollama)
- The docker-compose.yml file to launch services

Install Ollama on Windows:
- Download the Installer: get OllamaSetup.exe from the official Ollama website.
- Run the Installer: double-click OllamaSetup.exe and follow the installation prompts.
- Verify that Ollama is running.

Start Ollama Server (if not already running):
ollama serve
Check the installed version of Ollama:
ollama --version
Expected Output:
ollama version 0.7.1
Pull the deepseek-r1 Model:
ollama pull deepseek-r1

To pull specific versions:
ollama run deepseek-r1:1.5b
ollama run deepseek-r1:671b
Verify the model is installed:
ollama list
Expected Output:
NAME                 ID              SIZE
deepseek-r1:latest   xxxxxxxxxxxx    X.X GB

Check via the REST API:
curl http://localhost:11434/api/tags
Expected Output:
A JSON response listing installed models, including deepseek-r1:latest.
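The same check can be done programmatically by inspecting that JSON. A small sketch, assuming the response has the shape {"models": [{"name": ...}]} as returned by the Ollama tags endpoint (the sample payload below is illustrative, not captured from a live server):

```python
# Sketch: check whether a model appears in the JSON returned by
# GET http://localhost:11434/api/tags. In practice you would fetch the
# payload first, e.g. with urllib.request.urlopen.
import json

def has_model(tags_json: str, model_name: str) -> bool:
    """Return True if model_name is among the installed models."""
    models = json.loads(tags_json).get("models", [])
    return any(m.get("name") == model_name for m in models)

# Illustrative sample payload:
sample = '{"models": [{"name": "deepseek-r1:latest", "size": 4700000000}]}'
print(has_model(sample, "deepseek-r1:latest"))  # True
```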

Test generation from PowerShell:
Invoke-RestMethod -Uri http://localhost:11434/api/generate -Method Post -Body '{"model": "deepseek-r1", "prompt": "Hello, world!", "stream": false}' -ContentType "application/json"
Expected Response: A JSON object containing the model's response to the "Hello, world!" prompt.
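The same non-streaming call can be made from Python. The sketch below builds the request and parses the reply, assuming the standard Ollama /api/generate shape where, with "stream": false, the reply JSON carries the generated text in a "response" field (the request is constructed but not sent here, so no server is needed):

```python
# Sketch: the non-streaming /api/generate call from Python's stdlib.
import json
import urllib.request

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request mirroring the Invoke-RestMethod example above."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def parse_generate_response(raw: str) -> str:
    """Extract the model's text from a non-streaming generate response."""
    return json.loads(raw)["response"]

# To actually send it (requires Ollama running):
# urllib.request.urlopen(build_generate_request("deepseek-r1", "Hello, world!"))
```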

Run the model interactively:
ollama run deepseek-r1
This opens a chat session with the deepseek-r1 model. Type /bye and press Enter to exit the chat session.


Clone the Repository:
git clone https://github.com/ahmad-act/Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows.git
cd Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows
To launch both the MCP tool and Open WebUI locally (on Docker Desktop):
docker-compose up --build

This will:
- Build and start the Leave Manager MCP server, exposed via mcpo on port 8000
- Start Open WebUI on port 3000

The MCP tools are exposed via the OpenAPI specification at: http://localhost:8000/openapi.json.
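That OpenAPI document can be inspected programmatically to see which tool endpoints mcpo has published. A sketch, assuming mcpo exposes each tool as a POST path (the sample spec and tool paths below are illustrative; in practice you would fetch http://localhost:8000/openapi.json, e.g. with urllib.request.urlopen):

```python
# Sketch: list the tool endpoints from an OpenAPI spec document.
import json

def list_tool_paths(spec_json: str) -> list[str]:
    """Return the POST paths defined in an OpenAPI document."""
    spec = json.loads(spec_json)
    return [path for path, ops in spec.get("paths", {}).items() if "post" in ops]

# Illustrative sample spec (real tool names depend on main.py):
sample_spec = '{"paths": {"/get_leave_balance": {"post": {}}, "/apply_leave": {"post": {}}}}'
print(list_tool_paths(sample_spec))  # ['/get_leave_balance', '/apply_leave']
```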



Use these prompts in Open WebUI to interact with the Leave Manager tool:
Check how many leave days are left for employee E001

Apply for leave for employee E001

What's the leave history of E001?

Greet me as Alice

Troubleshooting:
- Ensure the Ollama server is running (ollama serve) and check http://localhost:11434.
- Confirm the deepseek-r1 model is listed with ollama list.
- Make sure ports 11434, 3000, and 8000 are free.