
Getting Started with MCP

1. Overview

# Test server
CPU: 2 cores
Memory: 4 GB
Disk: 50 GB

# Note
If you prefer not to use Docker, simply skip the Docker-related commands,
i.e. run the model with Ollama and build the MCP server/client with Python directly on the host.

2. Install Docker

# Install Docker
https://docs.docker.com/get-docker/
# Install Docker Compose
https://docs.docker.com/compose/install/
# Install Docker on CentOS
https://mp.weixin.qq.com/s/nHNPbCmdQs3E5x1QBP-ueA

3. Install Ollama and Python

Create a docker-compose.yaml file:

services:
  ollama:
    image: ollama/ollama
    container_name: mcp-ollama
    ports:
      - "11434:11434"
  python:
    image: python:3.11-alpine
    container_name: mcp-python
    tty: true

Create and start the containers:

docker-compose up -d

List the running containers:

docker ps

Stop and remove the containers:

docker-compose down

Remove the images:

docker rmi ollama/ollama python:3.11-alpine

3.1. The ollama container

See:
https://ollama.com/
https://github.com/ollama/ollama/blob/main/docs/docker.md
# Enter the container:
docker exec -it mcp-ollama bash
# Run the model:
ollama run qwen3:0.6b
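Once the model is pulled, Ollama's HTTP API listens on the mapped port 11434. As a quick smoke test you can build a request for its `/api/generate` endpoint; the sketch below only constructs the URL and JSON body (the host defaults to `localhost`, which assumes you are on the Docker host running the compose file above — adjust it otherwise):

```python
import json

def build_generate_request(model: str, prompt: str, host: str = "http://localhost:11434"):
    """Build the URL and JSON body for Ollama's /api/generate endpoint."""
    url = f"{host}/api/generate"
    # stream=False asks Ollama to return one complete JSON response
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return url, body

url, body = build_generate_request("qwen3:0.6b", "Why is the sky blue?")
print(url)
print(body)
```

To actually send it, pass `body` as the POST payload to `url` (e.g. with `urllib.request` or `httpx`) while the `mcp-ollama` container is running.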

3.2. The python container

See:
https://developer.aliyun.com/mirror/alpine
https://developer.aliyun.com/mirror/pypi
# Enter the container:
docker exec -it mcp-python sh
# Back up the apk repository list:
cp /etc/apk/repositories /etc/apk/repositories-bak
# Switch to the Aliyun mirror:
sed -i 's/dl-cdn.alpinelinux.org/mirrors.aliyun.com/g' /etc/apk/repositories
# Create the ~/.pip directory:
mkdir ~/.pip
# Write the ~/.pip/pip.conf file:
cat << EOF > ~/.pip/pip.conf
[global]
index-url = http://mirrors.aliyun.com/pypi/simple/

[install]
trusted-host=mirrors.aliyun.com
EOF
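pip.conf is plain INI, so the file written above can be sanity-checked with Python's standard `configparser`. A minimal sketch, parsing the same contents from a string (on the container you could instead point `parser.read()` at `~/.pip/pip.conf`):

```python
import configparser

# The same contents the heredoc above writes to ~/.pip/pip.conf
PIP_CONF = """\
[global]
index-url = http://mirrors.aliyun.com/pypi/simple/

[install]
trusted-host = mirrors.aliyun.com
"""

parser = configparser.ConfigParser()
parser.read_string(PIP_CONF)
print(parser["global"]["index-url"])
```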

4. Testing

See:
https://modelcontextprotocol.io/quickstart/server
https://modelcontextprotocol.io/quickstart/client
https://mcp-docs.cn/quickstart/server
https://mcp-docs.cn/quickstart/client
https://github.com/modelcontextprotocol/quickstart-resources/blob/main/weather-server-python/weather.py
https://github.com/modelcontextprotocol/quickstart-resources/blob/main/mcp-client-python/client.py
https://github.com/ollama/ollama-python

Enter the container:

# Enter the container:
docker exec -it mcp-python sh
# Change to the working directory:
cd /root

Install uv:

pip install uv

Create the project:

# Create a new directory for our project
uv init weather
cd weather
# Create a virtual environment and activate it
uv venv
source .venv/bin/activate
# Install dependencies
uv add "mcp[cli]" httpx python-dotenv anthropic ollama
# Create our server and client files
touch server.py client.py
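Under the hood, the stdio transport that server.py and client.py will use exchanges JSON-RPC 2.0 messages over the server process's stdin/stdout. Purely for illustration — the field values below are made up, and the exact protocol version string depends on your mcp package release — the client's opening `initialize` request looks roughly like this:

```python
import json

# Sketch of the first message an MCP client sends over stdio (illustrative values)
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # assumption: varies by SDK release
        "capabilities": {},
        "clientInfo": {"name": "mcp-client", "version": "0.1.0"},
    },
}

# On the wire, each message is serialized to a single line of JSON
line = json.dumps(initialize_request)
print(line)
```

The SDK handles all of this for you; the point is only that "connecting to a server" below means spawning the server script and speaking JSON-RPC to it over pipes.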

Create the server.py file:

See: https://github.com/modelcontextprotocol/quickstart-resources/blob/main/weather-server-python/weather.py
from typing import Any
import httpx
from mcp.server.fastmcp import FastMCP

# Initialize FastMCP server
mcp = FastMCP("weather")

# Constants
NWS_API_BASE = "https://api.weather.gov"
USER_AGENT = "weather-app/1.0"

async def make_nws_request(url: str) -> dict[str, Any] | None:
    """Make a request to the NWS API with proper error handling."""
    headers = {
        "User-Agent": USER_AGENT,
        "Accept": "application/geo+json"
    }
    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(url, headers=headers, timeout=30.0)
            response.raise_for_status()
            return response.json()
        except Exception:
            return None

def format_alert(feature: dict) -> str:
    """Format an alert feature into a readable string."""
    props = feature["properties"]
    return f"""
Event: {props.get('event', 'Unknown')}
Area: {props.get('areaDesc', 'Unknown')}
Severity: {props.get('severity', 'Unknown')}
Description: {props.get('description', 'No description available')}
Instructions: {props.get('instruction', 'No specific instructions provided')}
"""

@mcp.tool()
async def get_alerts(state: str) -> str:
    """Get weather alerts for a US state.

    Args:
        state: Two-letter US state code (e.g. CA, NY)
    """
    url = f"{NWS_API_BASE}/alerts/active/area/{state}"
    data = await make_nws_request(url)

    if not data or "features" not in data:
        return "Unable to fetch alerts or no alerts found."

    if not data["features"]:
        return "No active alerts for this state."

    alerts = [format_alert(feature) for feature in data["features"]]
    return "\n---\n".join(alerts)

@mcp.tool()
async def get_forecast(latitude: float, longitude: float) -> str:
    """Get weather forecast for a location.

    Args:
        latitude: Latitude of the location
        longitude: Longitude of the location
    """
    # First get the forecast grid endpoint
    points_url = f"{NWS_API_BASE}/points/{latitude},{longitude}"
    points_data = await make_nws_request(points_url)

    if not points_data:
        return "Unable to fetch forecast data for this location."

    # Get the forecast URL from the points response
    forecast_url = points_data["properties"]["forecast"]
    forecast_data = await make_nws_request(forecast_url)

    if not forecast_data:
        return "Unable to fetch detailed forecast."

    # Format the periods into a readable forecast
    periods = forecast_data["properties"]["periods"]
    forecasts = []
    for period in periods[:5]:  # Only show next 5 periods
        forecast = f"""
{period['name']}:
Temperature: {period['temperature']}°{period['temperatureUnit']}
Wind: {period['windSpeed']} {period['windDirection']}
Forecast: {period['detailedForecast']}
"""
        forecasts.append(forecast)

    return "\n---\n".join(forecasts)

if __name__ == "__main__":
    # Initialize and run the server
    mcp.run(transport='stdio')

Create the client.py file:

See: https://github.com/modelcontextprotocol/quickstart-resources/blob/main/mcp-client-python/client.py
Note: the process_query_modify and process_tools methods below are additions to the quickstart client,
and the host value in the following line must be adjusted to your environment:
self.ollama = AsyncClient(host="http://localhost:11434")
import asyncio
import sys
from typing import Optional
from contextlib import AsyncExitStack

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

from anthropic import Anthropic
from dotenv import load_dotenv
from ollama import AsyncClient

load_dotenv()  # load environment variables from .env

class MCPClient:
    def __init__(self):
        # Initialize session and client objects
        self.session: Optional[ClientSession] = None
        self.exit_stack = AsyncExitStack()
        # self.anthropic = Anthropic()
        # self.ollama = AsyncClient(host="http://localhost:11434")
        self.ollama = AsyncClient(host="http://ollama:11434")

    async def connect_to_server(self, server_script_path: str):
        """Connect to an MCP server

        Args:
            server_script_path: Path to the server script (.py or .js)
        """
        is_python = server_script_path.endswith('.py')
        is_js = server_script_path.endswith('.js')
        if not (is_python or is_js):
            raise ValueError("Server script must be a .py or .js file")

        command = "python" if is_python else "node"
        server_params = StdioServerParameters(
            command=command,
            args=[server_script_path],
            env=None
        )

        stdio_transport = await self.exit_stack.enter_async_context(stdio_client(server_params))
        self.stdio, self.write = stdio_transport
        self.session = await self.exit_stack.enter_async_context(ClientSession(self.stdio, self.write))

        await self.session.initialize()

        # List available tools
        response = await self.session.list_tools()
        tools = response.tools
        print("\nConnected to server with tools:", [tool.name for tool in tools])

    async def process_query(self, query: str) -> str:
        """Process a query using Claude and available tools"""
        messages = [
            {
                "role": "user",
                "content": query
            }
        ]

        response = await self.session.list_tools()
        available_tools = [{
            "name": tool.name,
            "description": tool.description,
            "input_schema": tool.inputSchema
        } for tool in response.tools]

        # Initial Claude API call
        response = self.anthropic.messages.create(
            model="claude-3-5-sonnet-20241022",
            max_tokens=1000,
            messages=messages,
            tools=available_tools
        )

        # Process response and handle tool calls
        final_text = []

        for content in response.content:
            if content.type == 'text':
                final_text.append(content.text)
            elif content.type == 'tool_use':
                tool_name = content.name
                tool_args = content.input

                # Execute tool call
                result = await self.session.call_tool(tool_name, tool_args)
                final_text.append(f"[Calling tool {tool_name} with args {tool_args}]")

                # Continue conversation with tool results
                if hasattr(content, 'text') and content.text:
                    messages.append({
                        "role": "assistant",
                        "content": content.text
                    })
                messages.append({
                    "role": "user",
                    "content": result.content
                })

                # Get next response from Claude
                response = self.anthropic.messages.create(
                    model="claude-3-5-sonnet-20241022",
                    max_tokens=1000,
                    messages=messages,
                )

                final_text.append(response.content[0].text)

        return "\n".join(final_text)

    async def process_query_modify(self, query: str) -> str:
        """Process a query using Ollama and available tools"""
        messages = [
            {
                "role": "user",
                "content": query
            }
        ]

        response = await self.session.list_tools()
        print("------1------\n", response)
        tools = self.process_tools(response.tools)
        model_name = "qwen3:0.6b"

        # Initial Ollama API call
        response = await self.ollama.chat(
            model=model_name,
            messages=messages,
            tools=tools
        )

        # Process response and handle tool calls
        final_text = []
        print("------2------\n", response)
        if response.message.tool_calls:
            function = response.message.tool_calls[0].get("function")
            tool_name = function.get("name")
            tool_args = function.get("arguments")

            # Execute tool call
            result = await self.session.call_tool(tool_name, tool_args)
            final_text.append(f"[Calling tool {tool_name} with args {tool_args}]")
            print("------3------\n", result)

            # Continue conversation with the tool result
            messages.append({
                "role": "user",
                "content": result.content[0].text
            })

            # Get next response from Ollama
            response = await self.ollama.chat(
                model=model_name,
                messages=messages,
            )
            final_text.append(response.message.content)

        return "\n".join(final_text)

    # https://github.com/ollama/ollama-python/blob/main/examples/tools.py
    def process_tools(self, response_tools) -> list:
        tools = []
        for tool in response_tools:
            item = {
                'type': 'function',
                'function': {
                    'name': tool.name,
                    'description': tool.description,
                    'parameters': {
                        'type': tool.inputSchema.get('type'),
                        'required': tool.inputSchema.get('required'),
                        'properties': tool.inputSchema.get('properties'),
                    },
                },
            }
            tools.append(item)
        return tools

    async def chat_loop(self):
        """Run an interactive chat loop"""
        print("\nMCP Client Started!")
        print("Type your queries or 'quit' to exit.")

        while True:
            try:
                query = input("\nQuery: ").strip()
                if query.lower() == 'quit':
                    break
                response = await self.process_query_modify(query)
                print("\n" + response)
            except Exception as e:
                print(f"\nError: {str(e)}")

    async def cleanup(self):
        """Clean up resources"""
        await self.exit_stack.aclose()

async def main():
    if len(sys.argv) < 2:
        print("Usage: python client.py <path_to_server_script>")
        sys.exit(1)

    client = MCPClient()
    try:
        await client.connect_to_server(sys.argv[1])
        await client.chat_loop()
    finally:
        await client.cleanup()

if __name__ == "__main__":
    asyncio.run(main())
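The process_tools conversion can be checked in isolation: given a tool whose inputSchema matches what FastMCP generates for get_alerts, it should emit Ollama's function-tool format. A standalone sketch with a stubbed tool object (the stub merely imitates one entry of `session.list_tools().tools`):

```python
from types import SimpleNamespace

def process_tools(response_tools) -> list:
    """Convert MCP tool metadata into Ollama's function-tool format (as in client.py)."""
    tools = []
    for tool in response_tools:
        tools.append({
            'type': 'function',
            'function': {
                'name': tool.name,
                'description': tool.description,
                'parameters': {
                    'type': tool.inputSchema.get('type'),
                    'required': tool.inputSchema.get('required'),
                    'properties': tool.inputSchema.get('properties'),
                },
            },
        })
    return tools

# Stub standing in for one entry of session.list_tools().tools
stub = SimpleNamespace(
    name="get_alerts",
    description="Get weather alerts for a US state.",
    inputSchema={
        "type": "object",
        "required": ["state"],
        "properties": {"state": {"type": "string"}},
    },
)
print(process_tools([stub]))
```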

Run the script:

uv run client.py server.py

Example queries:

# Example 1: California weather
What are the weather alerts in California
# Example 2: New York weather
What are the weather alerts in New York
# Example 3: Ohio weather
What are the weather alerts in Ohio
# Example 4: a general query that needs no tool
Write a bubble sort in Go

5. See also

https://modelcontextprotocol.io/quickstart
https://mcp-docs.cn/quickstart
https://github.com/modelcontextprotocol/quickstart-resources
https://github.com/ollama/ollama-python
https://github.com/fufankeji/LLMs-Technology-Community-Beyondata
https://blog.csdn.net/u012894975/article/details/147828391
https://gitee.com/MarsBighead/hawthorn
https://mp.weixin.qq.com/s/i36MBzaO6obseW_Zln0rUA
http://www.lryc.cn/news/592956.html
