mcp123
An ultra-simplified HTTP-SSE MCP server and client with automatic tool creation.
Ultra-minimal setup: Start a server or client in 2 lines.
Easy tool creation: Write normal functions in your tools.py file (no decorators or special wrappers needed) and they are automatically exposed as tools by your MCP server.
OpenAI integration: The client uses your OpenAI API key to answer questions, calling tools as needed.
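How a plain, undecorated function can become an LLM-callable tool can be sketched by deriving an OpenAI-style tool schema from its signature. This is an illustrative sketch using `inspect`, not mcp123's actual internals; the `tool_schema` helper and `PY_TO_JSON` mapping are assumptions for demonstration:

```python
import inspect

def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

# Minimal mapping from Python annotations to JSON Schema types (assumption).
PY_TO_JSON = {int: "integer", float: "number", str: "string", bool: "boolean"}

def tool_schema(fn):
    """Build an OpenAI-style tool schema from a function's signature and docstring."""
    sig = inspect.signature(fn)
    props = {
        name: {"type": PY_TO_JSON.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
    }
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": fn.__doc__ or "",
            "parameters": {
                "type": "object",
                "properties": props,
                "required": list(props),
            },
        },
    }

schema = tool_schema(add)
```

Because the schema is derived purely from the signature and docstring, no decorator or registration step is required.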
```shell
pip install -r requirements.txt
```
Define your functions in tools.py. No decorators are needed; they are automatically added to your MCP server as tools. For example:
```python
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b
```
```python
from mcp123 import server

server.run_server("tools.py", port=9999)
```
```python
from mcp123.client import McpClient

client = McpClient("http://localhost:9999", "sk-...your OpenAI key...")
answer = client.ask("Add 15 and 14.")
print("Answer:", answer)
client.close()
```
Server: Loads all top-level functions from tools.py and exposes them as MCP tools via HTTP.
Client: Discovers available tools, sends prompts to OpenAI, and automatically calls tools if needed.
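The server's "load all top-level functions" step can be sketched with `importlib` and `inspect`. This is an assumed implementation for illustration, not mcp123's actual code; the `load_tools` helper is hypothetical:

```python
import importlib.util
import inspect

def load_tools(path):
    """Load a module from a file path and return its top-level functions by name.

    Illustrative sketch: mcp123 may discover tools differently.
    """
    spec = importlib.util.spec_from_file_location("tools", path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    # Keep only functions defined in the file itself, not imported ones.
    return {
        name: obj
        for name, obj in vars(module).items()
        if inspect.isfunction(obj) and obj.__module__ == "tools"
    }
```

Filtering on `__module__` ensures that helper functions imported into tools.py from elsewhere are not accidentally exposed as tools.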
When you run the client, you’ll see:

```
Tools discovered:
[ ...list of tools... ]
Answer: 29
```
Python 3.11+
OpenAI API key (for the client)
Zero boilerplate: No need to write schemas or wrappers—just write functions.
LLM-native: Designed for seamless LLM tool use.
Extensible: Add more tools by simply adding functions.
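Extending the server is just adding another plain function to tools.py. A sketch of a tools.py with a second, hypothetical tool (`greet` is an example name, not part of the project):

```python
# tools.py -- each top-level function becomes an MCP tool automatically.

def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

def greet(name: str) -> str:
    """Return a friendly greeting for the given name."""
    return f"Hello, {name}!"
```

On the next server start, both functions would be discovered and exposed as tools with no extra registration.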
Built with FastMCP
Inspired by the Model Context Protocol (MCP)
Pull requests and issues are welcome.