MarkTechPost@AI · 2 days ago, 01:10
A Step-by-Step Coding Guide to Defining Custom Model Context Protocol (MCP) Server and Client Tools with FastMCP and Integrating Them into Google Gemini 2.0’s Function‑Calling Workflow

This article provides a step-by-step guide demonstrating how to combine Google's Gemini 2.0 generative AI with FastMCP to create custom Model Context Protocol (MCP) server and client tools and integrate them into Gemini 2.0's function-calling workflow. Using Python libraries, the tutorial shows how to handle API keys securely, define MCP tools, interact with the Gemini model, and retrieve structured weather data. The approach simplifies developing and testing MCP integrations in Colab.

🔑 Securely obtain and store the GEMINI_API_KEY via the getpass module for subsequent Gemini API authentication, ensuring the API key is never displayed on screen.

⚙️ Install the required Python dependencies, including google-genai, fastmcp, httpx, and nest_asyncio, to support Gemini API calls, building the MCP server and client, HTTP requests, and asynchronous operation in the Colab environment.

🛠️ Create a FastMCP server and register two tools: get_weather, which fetches a three-day forecast from the Open-Meteo API, and get_alerts, which returns placeholder weather-alert information at the U.S. state level.

💡 Define JSON schema specifications describing the input parameters of the get_weather and get_alerts tools, including names, descriptions, required properties, and data types, which helps Gemini generate and validate the corresponding function calls.

🚀 Write the asynchronous run_gemini function, which sends a natural-language prompt to Gemini via the MCP client, captures the resulting function call, invokes the corresponding MCP tool, and prints the structured weather data, completing the end-to-end flow.

In this Colab‑ready tutorial, we demonstrate how to integrate Google’s Gemini 2.0 generative AI with an in‑process Model Context Protocol (MCP) server, using FastMCP. Starting with an interactive getpass prompt to capture your GEMINI_API_KEY securely, we install and configure all necessary dependencies: the google‑genai Python client for calling the Gemini API, fastmcp for defining and hosting our MCP tools in‑process, httpx for making HTTP requests to the Open‑Meteo weather API, and nest_asyncio to patch Colab’s already‑running asyncio event loop. The workflow proceeds by spinning up a minimal FastMCP “weather” server with two tools, get_weather(latitude, longitude) for a three‑day forecast and get_alerts(state) for state‑level weather alerts, then creating a FastMCPTransport to connect an MCP client to that server. Finally, using the Gemini function‑calling feature, we send a natural‑language prompt to Gemini, have it emit a function call based on our explicit JSON schemas, and then execute that call via the MCP client, returning structured weather data into our notebook.

from getpass import getpass
import os

api_key = getpass("Enter your GEMINI_API_KEY: ")
os.environ["GEMINI_API_KEY"] = api_key

We securely prompt you to enter your Gemini API key (without displaying it on the screen) and then store it in the GEMINI_API_KEY environment variable, allowing the rest of your notebook to authenticate with Google’s API.

!pip install -q google-genai mcp fastmcp httpx nest_asyncio

We install all the core dependencies needed for our Colab notebook in one go—google‑genai for interacting with the Gemini API, mcp and fastmcp for building and hosting our Model Context Protocol server and client, httpx for making HTTP requests to external APIs, and nest_asyncio to patch the event loop so our async code runs smoothly.
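The patch step described next is two lines of code that were not captured in the scrape; this is the standard invocation of nest_asyncio (the library installed above):

```python
import nest_asyncio

# Patch the notebook's already-running event loop so that
# run_until_complete() can be called from a Colab/Jupyter cell
# without raising "event loop already running".
nest_asyncio.apply()
```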

We apply the nest_asyncio patch to the notebook’s existing event loop, allowing us to run asyncio coroutines (like our MCP client interactions) without encountering “event loop already running” errors.

from fastmcp import FastMCP
import httpx

mcp_server = FastMCP("weather")

@mcp_server.tool()
def get_weather(latitude: float, longitude: float) -> str:
    """3-day min/max temperature forecast via Open-Meteo."""
    url = (
        f"https://api.open-meteo.com/v1/forecast"
        f"?latitude={latitude}&longitude={longitude}"
        "&daily=temperature_2m_min,temperature_2m_max&timezone=UTC"
    )
    resp = httpx.get(url, timeout=10)
    daily = resp.json()["daily"]
    return "\n".join(
        f"{date}: low {mn}°C, high {mx}°C"
        for date, mn, mx in zip(
            daily["time"],
            daily["temperature_2m_min"],
            daily["temperature_2m_max"],
        )
    )

@mcp_server.tool()
def get_alerts(state: str) -> str:
    """Dummy US-state alerts."""
    return f"No active weather alerts for {state.upper()}."

We create an in‑process FastMCP server named “weather” and register two tools: get_weather(latitude, longitude), which fetches and formats a 3‑day temperature forecast from the Open‑Meteo API using httpx, and get_alerts(state), which returns a placeholder message for U.S. state weather alerts.
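To see what get_weather's formatting step produces, here is the same join/zip expression applied to a hand-written sample of the Open-Meteo `daily` object (the values are illustrative, not real API output):

```python
# Illustrative sample of the "daily" object returned by the
# Open-Meteo forecast endpoint (values are made up).
daily = {
    "time": ["2025-05-01", "2025-05-02", "2025-05-03"],
    "temperature_2m_min": [9.1, 8.4, 10.0],
    "temperature_2m_max": [17.3, 16.8, 18.2],
}

# The same formatting expression used inside get_weather:
report = "\n".join(
    f"{date}: low {mn}°C, high {mx}°C"
    for date, mn, mx in zip(
        daily["time"],
        daily["temperature_2m_min"],
        daily["temperature_2m_max"],
    )
)
print(report)
# 2025-05-01: low 9.1°C, high 17.3°C
# 2025-05-02: low 8.4°C, high 16.8°C
# 2025-05-03: low 10.0°C, high 18.2°C
```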

import asyncio

from google import genai
from google.genai import types
from fastmcp import Client as MCPClient
from fastmcp.client.transports import FastMCPTransport

We import the core libraries for our MCP‑Gemini integration: asyncio to run asynchronous code, google‑genai and its types module for calling Gemini and defining function‑calling schemas, and FastMCP’s Client (aliased as MCPClient) with its FastMCPTransport to connect our in‑process weather server to the MCP client.

client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))
MODEL = "gemini-2.0-flash"
transport = FastMCPTransport(mcp_server)

We initialize the Google Gemini client using the GEMINI_API_KEY from your environment, specify the gemini-2.0-flash model for function‑calling, and set up a FastMCPTransport that connects the in‑process mcp_server to the MCP client.

function_declarations = [
    {
        "name": "get_weather",
        "description": "Return a 3-day min/max temperature forecast for given coordinates.",
        "parameters": {
            "type": "object",
            "properties": {
                "latitude": {
                    "type": "number",
                    "description": "Latitude of target location."
                },
                "longitude": {
                    "type": "number",
                    "description": "Longitude of target location."
                }
            },
            "required": ["latitude", "longitude"]
        }
    },
    {
        "name": "get_alerts",
        "description": "Return any active weather alerts for a given U.S. state.",
        "parameters": {
            "type": "object",
            "properties": {
                "state": {
                    "type": "string",
                    "description": "Two-letter U.S. state code, e.g. 'CA'."
                }
            },
            "required": ["state"]
        }
    }
]

tool_defs = types.Tool(function_declarations=function_declarations)

We manually define the JSON schema specifications for our two MCP tools, get_weather (which accepts latitude and longitude as numeric inputs) and get_alerts (which accepts a U.S. state code as a string), including names, descriptions, required properties, and data types. We then wrap these declarations in a types.Tool object (tool_defs), which tells Gemini how to generate and validate the corresponding function calls.
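As a sanity check on such declarations, a few lines of plain Python can verify that a candidate argument dict satisfies a schema's required/type constraints. The check_args helper below is hypothetical and only covers the "number"/"string" types used here; Gemini performs its own validation:

```python
# Hypothetical helper: check args against one declaration's "parameters" schema.
TYPE_MAP = {"number": (int, float), "string": str}

def check_args(schema: dict, args: dict) -> bool:
    # Every required property must be present.
    for name in schema.get("required", []):
        if name not in args:
            return False
    # Every supplied property must be declared and correctly typed.
    for name, value in args.items():
        prop = schema["properties"].get(name)
        if prop is None or not isinstance(value, TYPE_MAP[prop["type"]]):
            return False
    return True

weather_schema = {
    "type": "object",
    "properties": {
        "latitude": {"type": "number"},
        "longitude": {"type": "number"},
    },
    "required": ["latitude", "longitude"],
}

print(check_args(weather_schema, {"latitude": 37.77, "longitude": -122.42}))  # True
print(check_args(weather_schema, {"latitude": "north"}))                      # False
```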

async def run_gemini(lat: float, lon: float):
    async with MCPClient(transport) as mcp_client:
        prompt = f"Give me a 3-day weather forecast for latitude={lat}, longitude={lon}."
        response = client.models.generate_content(
            model=MODEL,
            contents=[prompt],
            config=types.GenerateContentConfig(
                temperature=0,
                tools=[tool_defs]
            )
        )
        call = response.candidates[0].content.parts[0].function_call
        if not call:
            print("No function call; Gemini said:", response.text)
            return
        print("Gemini wants:", call.name, call.args)
        result = await mcp_client.call_tool(call.name, call.args)
        print("\nTool result:\n", result)

asyncio.get_event_loop().run_until_complete(run_gemini(37.7749, -122.4194))

Finally, this async function run_gemini opens an MCP client session over our in‑process transport, sends a natural‑language prompt to Gemini asking for a 3‑day forecast at the given coordinates, captures the resulting function call (if any), invokes the corresponding MCP tool, and prints out the structured weather data, all of which is kicked off by running it in the notebook’s event loop with run_until_complete.
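The core of run_gemini is a name-based dispatch: Gemini emits a function name plus arguments, and the client routes them to the matching tool. Stripped of Gemini and MCP, that pattern is a dict lookup; the stand-in handlers below are hypothetical replacements for the real network-backed tools:

```python
# Stand-in handlers mimicking the two MCP tools (no network calls).
def fake_get_weather(latitude: float, longitude: float) -> str:
    return f"forecast for ({latitude}, {longitude})"

def fake_get_alerts(state: str) -> str:
    return f"No active weather alerts for {state.upper()}."

HANDLERS = {"get_weather": fake_get_weather, "get_alerts": fake_get_alerts}

def dispatch(name: str, args: dict) -> str:
    # Mirrors: result = await mcp_client.call_tool(call.name, call.args)
    handler = HANDLERS.get(name)
    if handler is None:
        raise ValueError(f"Unknown tool: {name}")
    return handler(**args)

print(dispatch("get_alerts", {"state": "ca"}))  # No active weather alerts for CA.
```

In the real flow, call.name and call.args come from Gemini's structured function_call part rather than being hard-coded, but the routing logic is the same.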

In conclusion, we have a fully contained pipeline that showcases how to define custom MCP tools in Python, expose them via FastMCP, and seamlessly integrate them with Google’s Gemini 2.0 model using the google‑genai client. The key frameworks, FastMCP for MCP hosting, FastMCPTransport and MCPClient for transport and invocation, httpx for external API access, and nest_asyncio for Colab compatibility, work together to enable real‑time function calling without external processes or stdio pipes. This pattern simplifies local development and testing of MCP integrations in Colab and provides a template for building more advanced agentic applications that combine LLM reasoning with specialized domain tools.


Here is the Colab Notebook.
