MarkTechPost@AI · April 23
A Coding Guide to Build an Agentic AI‑Powered Asynchronous Ticketing Assistant Using PydanticAI Agents, Pydantic v2, and SQLite Database

 

This article shows how to use the PydanticAI library to build an end-to-end, Agentic AI-powered ticketing assistant. The assistant defines its data rules with Pydantic v2 models, stores tickets in an in-memory SQLite database, and generates unique identifiers with Python's uuid module. Behind the scenes, two agents, one for creating tickets and one for checking status, use Google Gemini to interpret natural-language prompts and call custom database functions. The result is a clean, type-safe workflow that runs immediately in Colab.

⚙️ First, install and import the required libraries: upgrade pip, install PydanticAI, bring in nest_asyncio for handling asynchronous operations, and import sqlite3 and uuid for the database and ID generation, along with the relevant Pydantic and PydanticAI modules, to prepare for building the ticketing assistant.

🔑 To use the Google Gemini API securely, set the GEMINI_API_KEY environment variable so the key is not exposed. In environments like Colab, the getpass function prompts the user for the API key and stores it in os.environ, so the AI agents can authenticate automatically.

🗄️ The article uses a SQLite database to store ticket data and defines a tickets table with ticket_id, summary, severity, department, and status columns. A TicketingDependencies class passes the database connection around, while the CreateTicketOutput and TicketStatusOutput Pydantic models define the data structures, ensuring the data is valid and consistent.

🤖 Two PydanticAI agents are created: create_agent for creating tickets and status_agent for querying ticket status. Each agent is connected to Google Gemini and registers a tool function that, driven by natural-language prompts, performs the database operation and returns validated data, implementing ticket creation and status lookup.

✅ Finally, the SQLite connection is passed to the agents via the deps variable; a natural-language prompt creates a ticket and the ticket data is printed as JSON. The returned ticket ID is then used to query the ticket's status, which is also printed as JSON, demonstrating how Agentic AI and PydanticAI work together to automate ticket handling.

In this tutorial, we’ll build an end‑to‑end ticketing assistant powered by Agentic AI using the PydanticAI library. We’ll define our data rules with Pydantic v2 models, store tickets in an in‑memory SQLite database, and generate unique identifiers with Python’s uuid module. Behind the scenes, two agents, one for creating tickets and one for checking status, leverage Google Gemini (via PydanticAI’s google-gla provider) to interpret your natural‑language prompts and call our custom database functions. The result is a clean, type‑safe workflow you can run immediately in Colab.

!pip install --upgrade pip
!pip install pydantic-ai

First, these two commands update your pip installer to the latest version, bringing in new features and security patches, and then install PydanticAI. This library enables the definition of type-safe AI agents and the integration of Pydantic models with LLMs.

import os
from getpass import getpass

if "GEMINI_API_KEY" not in os.environ:
    os.environ["GEMINI_API_KEY"] = getpass("Enter your Google Gemini API key: ")

We check whether the GEMINI_API_KEY environment variable is already set. If not, we securely prompt you (without echoing) to enter your Google Gemini API key at runtime, then store it in os.environ so that your Agentic AI calls can authenticate automatically.

We install the nest_asyncio package, which lets you patch the existing asyncio event loop so that you can call async functions (or use .run_sync()) inside environments like Colab without running into “event loop already running” errors.
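The corresponding cell is not reproduced above; a minimal sketch of it, assuming a standard Colab setup, would be:

!pip install nest_asyncio

import nest_asyncio

# Patch the notebook's already-running asyncio event loop so that
# awaited agent calls (or .run_sync()) work without loop errors.
nest_asyncio.apply()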

import sqlite3
import uuid
from dataclasses import dataclass
from typing import Literal

from pydantic import BaseModel, Field
from pydantic_ai import Agent, RunContext

We bring in Python’s sqlite3 for our in‑memory database and uuid to generate unique ticket IDs, use dataclass and Literal for clear dependency and type definitions, and load Pydantic’s BaseModel/Field for enforcing data schemas alongside Agent and RunContext from PydanticAI to wire up and run our conversational agents.

conn = sqlite3.connect(":memory:")conn.execute("""CREATE TABLE tickets (    ticket_id TEXT PRIMARY KEY,    summary   TEXT NOT NULL,    severity  TEXT NOT NULL,    department TEXT NOT NULL,    status    TEXT NOT NULL)""")conn.commit()

We set up an in‑memory SQLite database and define a tickets table with columns for ticket_id, summary, severity, department, and status, then commit the schema so you have a lightweight, transient store for managing your ticket records.

@dataclass
class TicketingDependencies:
    """Carries our DB connection into system prompts and tools."""
    db: sqlite3.Connection

class CreateTicketOutput(BaseModel):
    ticket_id: str = Field(..., description="Unique ticket identifier")
    summary: str = Field(..., description="Text summary of the issue")
    severity: Literal["low", "medium", "high"] = Field(..., description="Urgency level")
    department: str = Field(..., description="Responsible department")
    status: Literal["open"] = Field("open", description="Initial ticket status")

class TicketStatusOutput(BaseModel):
    ticket_id: str = Field(..., description="Unique ticket identifier")
    status: Literal["open", "in_progress", "resolved"] = Field(..., description="Current ticket status")

Here, we define a simple TicketingDependencies dataclass to pass our SQLite connection into each agent call, and then declare two Pydantic models: CreateTicketOutput (with fields for ticket ID, summary, severity, department, and default status “open”) and TicketStatusOutput (with ticket ID and its current status). These models enforce a clear, validated structure on everything our agents return, ensuring you always receive well-formed data.

create_agent = Agent(
    "google-gla:gemini-2.0-flash",
    deps_type=TicketingDependencies,
    output_type=CreateTicketOutput,
    system_prompt="You are a ticketing assistant. Use the `create_ticket` tool to log new issues."
)

@create_agent.tool
async def create_ticket(
    ctx: RunContext[TicketingDependencies],
    summary: str,
    severity: Literal["low", "medium", "high"],
    department: str
) -> CreateTicketOutput:
    """
    Logs a new ticket in the database.
    """
    tid = str(uuid.uuid4())
    ctx.deps.db.execute(
        "INSERT INTO tickets VALUES (?,?,?,?,?)",
        (tid, summary, severity, department, "open")
    )
    ctx.deps.db.commit()
    return CreateTicketOutput(
        ticket_id=tid,
        summary=summary,
        severity=severity,
        department=department,
        status="open"
    )

We create a PydanticAI Agent named create_agent that’s wired to Google Gemini and is aware of our SQLite connection (deps_type=TicketingDependencies) and output schema (output_type=CreateTicketOutput). The @create_agent.tool decorator then registers an async create_ticket function, which generates a UUID, inserts a new row into the tickets table, and returns a validated CreateTicketOutput object.

status_agent = Agent(
    "google-gla:gemini-2.0-flash",
    deps_type=TicketingDependencies,
    output_type=TicketStatusOutput,
    system_prompt="You are a ticketing assistant. Use the `get_ticket_status` tool to retrieve current status."
)

@status_agent.tool
async def get_ticket_status(
    ctx: RunContext[TicketingDependencies],
    ticket_id: str
) -> TicketStatusOutput:
    """
    Fetches the ticket status from the database.
    """
    cur = ctx.deps.db.execute(
        "SELECT status FROM tickets WHERE ticket_id = ?", (ticket_id,)
    )
    row = cur.fetchone()
    if not row:
        raise ValueError(f"No ticket found for ID {ticket_id!r}")
    return TicketStatusOutput(ticket_id=ticket_id, status=row[0])

We set up a second PydanticAI Agent, status_agent, also using the Google Gemini provider and our shared TicketingDependencies. It registers an async get_ticket_status tool that looks up a given ticket_id in the SQLite database and returns a validated TicketStatusOutput, or raises an error if the ticket isn’t found.

deps = TicketingDependencies(db=conn)

create_result = await create_agent.run(
    "My printer on 3rd floor shows a paper jam error.", deps=deps
)
print("Created Ticket →")
print(create_result.output.model_dump_json(indent=2))

tid = create_result.output.ticket_id
status_result = await status_agent.run(
    f"What's the status of ticket {tid}?", deps=deps
)
print("Ticket Status →")
print(status_result.output.model_dump_json(indent=2))

Finally, we package the SQLite connection into deps, ask the create_agent to log a new ticket via a natural‑language prompt, and print the validated ticket data as JSON. We then take the returned ticket_id, query the status_agent for that ticket’s current state, and print the status in JSON form.

In conclusion, you have seen how Agentic AI and PydanticAI work together to automate a complete service process, from logging a new issue to retrieving its live status, all managed through conversational prompts. Our use of Pydantic v2 ensures every ticket matches the schema you define, while SQLite provides a lightweight backend that’s easy to replace with any database. With these tools in place, you can expand the assistant, adding new agent functions, integrating other AI models like openai:gpt-4o, or connecting real‑world APIs, confident that your data remains structured and reliable throughout.
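As a rough illustration of that model swap, not a prescribed implementation, the same agent definition could point at OpenAI instead of Gemini; this sketch assumes an OPENAI_API_KEY is set in the environment and that the hypothetical name openai_create_agent replaces create_agent, with everything else (deps, output models, tools) unchanged:

# Sketch: the same ticketing agent backed by a different model.
# Assumes OPENAI_API_KEY is available in os.environ.
openai_create_agent = Agent(
    "openai:gpt-4o",
    deps_type=TicketingDependencies,
    output_type=CreateTicketOutput,
    system_prompt="You are a ticketing assistant. Use the `create_ticket` tool to log new issues."
)
# Tools are registered per agent, so re-attach create_ticket
# with @openai_create_agent.tool before running it.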


Here is the Colab Notebook.


