# Customized MCP Project

This project is built on the `mcp` library with CLI support and integrates with the OpenAI API.
## Requirements

Install the required dependencies before running the project:

```bash
pip install -r requirements.txt
```
## Usage

Configure your OpenAI API key as an environment variable:

```bash
export OPENAI_API_KEY="your-api-key"
```
Start the MCP server:

```bash
python server.py
```
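For orientation, a minimal MCP server exposing a `get_weather` tool could be written with the SDK's `FastMCP` helper, as sketched below. The tool name matches the example later in this README, but the server name and the canned response are assumptions, not necessarily what the repository's `server.py` does.

```python
# Minimal sketch of an MCP server with a get_weather tool
# (illustrative only; the actual server.py in this repository may differ)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")  # server name is an assumption


@mcp.tool()
def get_weather(city: str) -> str:
    """Get weather for a city."""
    # A real tool would query a weather API; this sketch returns a canned answer.
    return f"The weather in {city} is sunny"


if __name__ == "__main__":
    mcp.run(transport="stdio")  # serve over stdio so a local client can connect
```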
Use the client to interact with the server:

```bash
python client.py
```
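Conceptually, the client launches the server as a subprocess and talks to it over stdio with the mcp SDK: it initializes a session, lists the available tools, and calls one. The sketch below shows that flow; the repository's `client.py` (for example, its interactive loop) may be structured differently.

```python
# Sketch of an MCP client connecting to server.py over stdio
# (illustrative only; the repository's client.py may differ)
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch server.py as a subprocess and communicate over stdio.
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])
            result = await session.call_tool("get_weather", {"city": "Beijing"})
            print("Result:", result.content)


if __name__ == "__main__":
    asyncio.run(main())
```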
Alternatively, use the orchestrator to query the LLM and tools:

```bash
python main.py
```
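The orchestrator presumably bridges the OpenAI chat completions API and the MCP tools: it advertises the MCP tool schemas to the model in function-calling format, and when the model requests a tool call it forwards that call to the MCP server. A minimal sketch of that loop, assuming an open `ClientSession` like the one above and the model name `gpt-4o-mini` (both assumptions, not necessarily how `main.py` is written):

```python
# Sketch of the LLM-to-tool bridge an orchestrator like main.py might implement
# (details assumed; not the repository's actual code)
import json

from openai import OpenAI

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment


async def ask_llm(session, user_message: str) -> str:
    # Expose the MCP tools to the model in OpenAI's function-calling format.
    mcp_tools = await session.list_tools()
    tools = [
        {
            "type": "function",
            "function": {
                "name": t.name,
                "description": t.description or "",
                "parameters": t.inputSchema,
            },
        }
        for t in mcp_tools.tools
    ]

    messages = [{"role": "user", "content": user_message}]
    response = openai_client.chat.completions.create(
        model="gpt-4o-mini",  # model name is an assumption
        messages=messages,
        tools=tools,
    )
    choice = response.choices[0].message

    # If the model asked for a tool, run it on the MCP server and return its output.
    if choice.tool_calls:
        call = choice.tool_calls[0]
        args = json.loads(call.function.arguments)
        result = await session.call_tool(call.function.name, args)
        return result.content[0].text
    return choice.content or ""
```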
## Example

### Querying the Weather Tool

Run the client and call the `get_weather` tool:

```bash
python client.py
```
Example interaction:
```text
You: List tools
Assistant: {
  "tools": [
    {
      "name": "get_weather",
      "description": "Get weather for a city",
      "parameters": {
        "city": {
          "type": "string",
          "description": "Name of the city"
        }
      }
    }
  ]
}
You: Call get_weather with {"city": "Beijing"}
Assistant: 北京的天气是晴天 ("The weather in Beijing is sunny")
```
## Dependencies

```text
openai==1.70.0
mcp[cli]==1.6.0
```
## License
This project is licensed under the MIT License.