# Graphiti Pro

An enhanced Graphiti MCP server with Streamable HTTP support

English | 中文
## About Graphiti
Graphiti is a framework for building and querying temporally-aware knowledge graphs, specifically tailored for AI agents operating in dynamic environments. Unlike traditional retrieval-augmented generation (RAG) methods, Graphiti continuously integrates user interactions, structured and unstructured enterprise data, and external information into a coherent, queryable graph. The framework supports incremental data updates, efficient retrieval, and precise historical queries without requiring complete graph recomputation, making it suitable for developing interactive, context-aware AI applications.
This project is an enhanced memory repository MCP service and management platform based on Graphiti. Compared to the original project's MCP service, it offers three core advantages: enhanced core capabilities, broader AI model compatibility, and a comprehensive visual management interface.
Adding memories is the core functionality of the MCP service. We have introduced an asynchronous parallel processing mechanism on top of the original implementation: up to 5 add-memory tasks can run in parallel for the same group ID (for example, a given development project), significantly improving processing efficiency.
Four new MCP tools have been added for managing `add_memory` tasks:

- `list_add_memory_tasks` - List all add_memory tasks
- `get_add_memory_task_status` - Get the status of an add_memory task
- `wait_for_add_memory_task` - Wait for an add_memory task to complete
- `cancel_add_memory_task` - Cancel an add_memory task

Configuration management has also been optimized to resolve inconsistencies between command-line parameters, environment variables, and the configuration stored in the management backend database.
> [!NOTE]
> When the management backend is enabled, the MCP service parameters in the `.env` configuration file only take effect during the initial startup. Subsequent runs use the parameters stored in the management backend database.
Through integration with the instructor library, model compatibility has been significantly improved. The service now supports models such as DeepSeek and Qwen, as well as locally run models served through Ollama or vLLM, as long as they expose an OpenAI-compatible API.
The original unified LLM configuration has been split into three independent configurations (two LLMs plus an embedding model), allowing flexible combinations based on actual needs.
> [!NOTE]
> When configuring the embedding model, note that its API path differs from that of the two LLMs above: LLMs use the chat completion path `{base_url}/chat/completions`, while text embedding uses `{base_url}/embeddings`. If you select "Same as Large Model" in the management backend, make sure the configured large model also supports text embedding.
>
> Additionally, if you run the service via docker compose while the LLM or embedding model is running locally, `base_url` must be set to `http://host.docker.internal:{port}`, with the port adjusted to match your locally running service.
To provide a better user experience and observability, we have developed a complete management backend and Web UI. Through the management interface, you can configure and monitor the service.
**Clone Project**

```shell
git clone http://github.com/itcook/graphiti-mcp-pro
# or: git clone git@github.com:itcook/graphiti-mcp-pro.git
cd graphiti-mcp-pro
```
**Configure Environment Variables (Optional)**

```shell
# Copy the example configuration file
mv .env.example.en .env
# Edit the .env file according to the instructions in it
```
> [!NOTE]
> If you want to continue using data from a previous Graphiti MCP setup, set the `NEO4J_`-prefixed parameters in the `.env` file to your Neo4j database connection information, and keep the other parameters at their defaults.
**Start Services**

```shell
docker compose up -d
```
> [!TIP]
> If the project has been updated and you need to rebuild the image, use `docker compose up -d --build`. Don't worry: data is persisted in the external database and will not be lost.
> [!NOTE]
> Prerequisites:
> - Python 3.10+ and the uv project manager
> - Node.js 20+
> - An accessible Neo4j 5.26+ database service
> - An AI model service
**Clone Project**

```shell
git clone http://github.com/itcook/graphiti-mcp-pro
# or: git clone git@github.com:itcook/graphiti-mcp-pro.git
cd graphiti-mcp-pro
```
**Install Dependencies**

```shell
uv sync
```
**Configure Environment Variables**

```shell
# Copy the example configuration file
mv .env.example.en .env
# Edit the .env file according to the instructions in it
```
**Run MCP Service**

```shell
# Run the service together with the management backend
uv run main.py -m

# Or run the MCP service only
# uv run main.py
```
**Build and Run Management Frontend**

Enter the frontend directory and install dependencies:

```shell
cd manager/frontend
pnpm install  # or npm install / yarn
```

Build and run the frontend:

```shell
pnpm run build    # or npm run build / yarn build
pnpm run preview  # or npm run preview / yarn preview
```
Then access the management interface at http://localhost:6062.
Developed with assistance from 🤖 Augment Code