MCP Protocol + LLM Tool Calling
mcp server list
Smithery - Model Context Protocol Registry
- a site that aggregates all kinds of MCP servers
- MCP Market
client
References:
- GitHub - modelcontextprotocol/python-sdk: The official Python SDK for Model C…
quickstart-resources/mcp-client-python/client.py at main · modelcontextprotoc…
- a complete client example
Chinese blog posts
Developing a Streamable HTTP MCP application in Python - 肖祥 - 博客园
- server + client code implementation
MCP Streamable HTTP protocol - 肖祥 - 博客园
- an introduction to the MCP Streamable HTTP protocol
How LangChain calls MCP tools
Using MCP with LiteLLM
/mcp {BETA} - Model Context Protocol | liteLLM
Features:
- MCP service proxying
Using the MCP service
Convert MCP tools into OpenAI-format tools:
tools = await litellm.experimental_mcp_client.load_mcp_tools(session=session, format="openai")
Call the MCP tool:
call_result = await experimental_mcp_client.call_openai_tool(session=session, openai_tool=openai_tool)
stdio client
Creation and invocation flow:
Start the server
The client must specify how the server is launched:
- the python command
- the .py file
- environment variables
Start the client session
- the session is tied to the server connection's (read, write) streams
The session fetches the capabilities the server provides:
- prompts: session.list_prompts()
- tools: session.list_tools()
- resources: session.list_resources()
- prompts: generate a prompt: session.get_prompt("prompt_name", arguments={k: v, ...})
- resources: read a resource: session.read_resource("resource_uri"), e.g.:
content, mime_type = await session.read_resource("file://some/path")
- tools: call a tool: session.call_tool("tool_name", arguments={k: v, ...}), e.g.:
result = await session.call_tool("tool-name", arguments={"arg1": "value"})
The tool_name and arguments here are produced by the LLM's tool-call response.
- When calling the LLM, put the tools into the ChatCompletion request; the response will then contain fields like content.type == 'tool_use', content.name, and content.input (this is based on the Anthropic example's result)
Tool-calling flow:
Convert list_tools() into a format the LLM can consume, and send it together with the user's query:
- openai function_call
- langchain tools
- anthropic tool calling
From the response, when the model decides to use a tool, extract the tool-call data:
- tool_name
- arguments
Call the tool:
session.call_tool("tool_name", arguments=arguments)
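The convert and extract steps are plain dict/JSON handling. A self-contained sketch (the helper names mcp_tool_to_openai and extract_tool_call are made up for illustration; the sample data mirrors the temperature_predict tool from the example run):

```python
import json

def mcp_tool_to_openai(name: str, description: str, input_schema: dict) -> dict:
    """Convert one MCP tool description into an OpenAI-format tool entry."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": {
                "type": "object",
                "properties": input_schema.get("properties", {}),
                "required": input_schema.get("required", []),
            },
        },
    }

def extract_tool_call(tool_call: dict) -> tuple:
    """Pull (tool_name, arguments) out of one entry of the response's tool_calls."""
    fn = tool_call["function"]
    return fn["name"], json.loads(fn["arguments"])

# sample data shaped like the temperature_predict tool from the example run
schema = {
    "properties": {
        "longitude": {"title": "Longitude", "type": "number"},
        "lattitude": {"title": "Lattitude", "type": "number"},
    },
    "required": ["longitude", "lattitude"],
}
openai_tool = mcp_tool_to_openai(
    "temperature_predict",
    "tomorrow temperature forecast for a given longitude,lattitude",
    schema,
)
name, args = extract_tool_call({
    "id": "call_1",
    "type": "function",
    "function": {"name": "temperature_predict",
                 "arguments": '{"longitude":100,"lattitude":88}'},
})
# name == 'temperature_predict', args == {'longitude': 100, 'lattitude': 88}
```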
Connecting to a streamable HTTP server (client side)
Converting session.list
FastAPI + streamable HTTP + MCP example
Server code
How to run
Using FastMCP.run(transport="streamable-http"):
python server.py
Using FastAPI + uvicorn:
uvicorn mcp_server:app --host 0.0.0.0 --port 8044
- FastMCP's session_manager must be started first, inside FastAPI's lifespan
- the mount must specify a path
- on the client side, the route must be given as path/mcp
Client code
A complete MCP streamable-HTTP client with OpenAI tool calling
python /data/sawyer/npu-3-sawyer/source/mdq-expstep-wukuang/client.py
Example run output:
[Tool(name='temperature_predict', description='tomorrow temperature forecast for a given longitude,lattitude', inputSchema={'properties': {'longitude': {'title': 'Longitude', 'type': 'number'}, 'lattitude': {'title': 'Lattitude', 'type': 'number'}}, 'required': ['longitude', 'lattitude'], 'title': 'temperature_predictArguments', 'type': 'object'}, annotations=None)]
available_tools = [{'type': 'function', 'function': {'name': 'temperature_predict', 'description': 'tomorrow temperature forecast for a given longitude,lattitude', 'parameters': {'type': 'object', 'properties': {'longitude': {'title': 'Longitude', 'type': 'number'}, 'lattitude': {'title': 'Lattitude', 'type': 'number'}}, 'required': ['longitude', 'lattitude']}}}]
tool_result = CallToolResult(meta=None, content=[TextContent(type='text', text='temperature tomorrow: 26.74℃', annotations=None)], isError=False)
[TextContent(type='text', text='temperature tomorrow: 26.74℃', annotations=None)]
resp = ChatCompletionMessage(content='', refusal=None, role='assistant', annotations=None, audio=None, function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call_y3XaVUTGUmqjaYJt9fWVwMRF', function=Function(arguments='{"longitude":100,"lattitude":88}', name='temperature_predict'), type='function', index=0)], reasoning=None)
tool_result = CallToolResult(meta=None, content=[TextContent(type='text', text='temperature tomorrow: 3.03℃', annotations=None)], isError=False)
resp = ChatCompletion(id='gen-1750323842-3sYR6s9VtrJd4oXhzTDM', choices=[Choice(finish_reason='stop', index=0, logprobs=None, message=ChatCompletionMessage(content="The expected temperature at your location (100° longitude, 88° latitude) is approximately 3.03°C. Here are some wearing suggestions to keep you comfortable at this cool temperature:\n\n1. Layer Up: \n - Wear a thermal base layer (top and bottom) under your clothes for added warmth.\n - A long-sleeve shirt or a lightweight sweater as a middle layer.\n\n2. Outer Layer:\n - A warm, insulated jacket, preferably windproof, to protect against the chill.\n\n3. Bottoms:\n - Wear comfortable jeans or thermal leggings. If you tend to get cold easily, consider thicker trousers.\n\n4. Accessories:\n - A knit hat or beanie to keep your head warm.\n - A scarf to protect your neck and provide extra warmth.\n - Gloves or mittens to keep your hands cozy.\n\n5. Footwear:\n - Insulated boots or warm shoes with thick socks to keep your feet warm.\n\n6. Optional:\n - If it's windy or there's a chance of rain, consider a waterproof outer layer or an umbrella.\n\nMake sure to check if it might get colder in the evening so you can adjust your layers accordingly!", refusal=None, role='assistant', annotations=None, audio=None, function_call=None, tool_calls=None, reasoning=None), native_finish_reason='stop')], created=1750323842, model='openai/gpt-4o-mini', object='chat.completion', service_tier=None, system_fingerprint='fp_34a54ae93c', usage=CompletionUsage(completion_tokens=251, prompt_tokens=75, total_tokens=326, completion_tokens_details=CompletionTokensDetails(accepted_prediction_tokens=None, audio_tokens=None, reasoning_tokens=0, rejected_prediction_tokens=None), prompt_tokens_details=PromptTokensDetails(audio_tokens=None, cached_tokens=0)), provider='OpenAI')
[{'content': 'I want to take part in a party at (100, 88) of longittude and ' 'lattitude, Give me some wearing suggestions based on the ' 'temperature', 'role': 'user'}, {'annotations': None, 'audio': None, 'content': '', 'function_call': None, 'reasoning': None, 'refusal': None, 'role': 'assistant', 'tool_calls': [{'function': {'arguments': '{"longitude":100,"lattitude":88}', 'name': 'temperature_predict'}, 'id': 'call_y3XaVUTGUmqjaYJt9fWVwMRF', 'index': 0, 'type': 'function'}]}, {'content': [TextContent(type='text', text='temperature tomorrow: 3.03℃', annotations=None)], 'name': 'temperature_predict', 'role': 'tool', 'tool_call_id': 'call_y3XaVUTGUmqjaYJt9fWVwMRF'}, {'annotations': None, 'audio': None, 'content': 'The expected temperature at your location (100° longitude, 88° ' 'latitude) is approximately 3.03°C. Here are some wearing ' 'suggestions to keep you comfortable at this cool temperature:\n' '\n' '1. Layer Up: \n' ' - Wear a thermal base layer (top and bottom) under your ' 'clothes for added warmth.\n' ' - A long-sleeve shirt or a lightweight sweater as a middle ' 'layer.\n' '\n' '2. Outer Layer:\n' ' - A warm, insulated jacket, preferably windproof, to protect ' 'against the chill.\n' '\n' '3. Bottoms:\n' ' - Wear comfortable jeans or thermal leggings. If you tend to ' 'get cold easily, consider thicker trousers.\n' '\n' '4. Accessories:\n' ' - A knit hat or beanie to keep your head warm.\n' ' - A scarf to protect your neck and provide extra warmth.\n' ' - Gloves or mittens to keep your hands cozy.\n' '\n' '5. Footwear:\n' ' - Insulated boots or warm shoes with thick socks to keep your ' 'feet warm.\n' '\n' '6. Optional:\n' " - If it's windy or there's a chance of rain, consider a " 'waterproof outer layer or an umbrella.\n' '\n' 'Make sure to check if it might get colder in the evening so you ' 'can adjust your layers accordingly!', 'function_call': None, 'reasoning': None, 'refusal': None, 'role': 'assistant', 'tool_calls': None}]
MCP server collection
Code execution
CodeRunner
Can run code in many languages, e.g. Python, JavaScript, …
Search APIs
brave search
The API comes in two forms:
web
- raw search-engine results
ai
- results post-processed by AI
Pricing:
- 2,000 free web search calls, then $3 per 1,000 calls
- 1,000 free AI search calls
tavily
AI search (results may be less factual than raw web search, but the information is denser)
Quota:
- 1,000 free calls
google/serper api
Google search
Pricing:
- $1 per 1,000 calls
mcp + langgraph
Key points:
Use a ToolNode:
from langgraph.prebuilt import ToolNode, tools_condition
Invoking the graph:
math_response = await graph.ainvoke({"messages": "what's (3 + 5) x 12?"})
Notes
A tool may return an empty result:
- ToolMessage(content='')
Even when the tool does return a result, in the tool_call -> model_call step the LLM may not summarize the tool output or reason further about the user's question
- which leaves the AIMessage empty
Last updated 2025-09-24 (360d44c)