MarkTechPost@AI 05月01日 10:20
A Step-by-Step Coding Guide to Integrate Dappier AI’s Real-Time Search and Recommendation Tools with OpenAI’s Chat API

This article explains how to enhance conversational applications with Dappier AI's real-time search and recommendation tools. By combining Dappier's RealTimeSearchTool and AIRecommendationTool, you can query the latest information from across the web and surface personalized article recommendations from a custom data model. The article provides a step-by-step guide: setting up a Google Colab environment, installing dependencies, securely loading API keys, and initializing the Dappier modules. Finally, these tools are integrated with an OpenAI chat model (such as gpt-3.5-turbo) to build a composable prompt chain and execute end-to-end queries across nine concise notebook cells. Whether you need up-to-the-minute news retrieval or AI-driven content curation, the article offers a flexible framework for building intelligent, data-driven chat experiences.

🔑 Dappier AI provides a real-time search tool (RealTimeSearchTool) and an AI recommendation tool (AIRecommendationTool) for enhancing conversational applications, fetching up-to-date information from the web and delivering personalized recommendations.

🛠️ The article walks through setting up Dappier AI in a Google Colab environment, including installing dependencies, securely loading API keys, and initializing the modules, ensuring seamless access to the Dappier tools, the LangChain runtime, and the OpenAI API.

🔗 Using LangChain, Dappier's real-time search tool is bound to OpenAI's gpt-3.5-turbo chat model to build a composable prompt chain (llm_chain) that executes end-to-end queries and drives intelligent interactions.

💡 The article shows how to query Dappier's real-time search engine directly via `search_tool.invoke`, how to fetch related article recommendations via `recommendation_tool.invoke`, and how to combine both in a complete tool chain (tool_chain) for comprehensive queries.

In this tutorial, we will learn how to harness the power of Dappier AI, a suite of real-time search and recommendation tools, to enhance our conversational applications. By combining Dappier’s cutting-edge RealTimeSearchTool with its AIRecommendationTool, we can query the latest information from across the web and surface personalized article suggestions from custom data models. We guide you step-by-step through setting up our Google Colab environment, installing dependencies, securely loading API keys, and initializing each Dappier module. We will then integrate these tools with an OpenAI chat model (e.g., gpt-3.5-turbo), construct a composable prompt chain, and execute end-to-end queries, all within nine concise notebook cells. Whether we need up-to-the-minute news retrieval or AI-driven content curation, this tutorial provides a flexible framework for building intelligent, data-driven chat experiences.

!pip install -qU langchain-dappier langchain langchain-openai langchain-community langchain-core openai

We bootstrap our Colab environment by installing the core LangChain libraries, both the Dappier extensions and the community integrations, alongside the official OpenAI client. With these packages in place, we will have seamless access to Dappier’s real-time search and recommendation tools, the latest LangChain runtimes, and the OpenAI API, all in one environment.

import os
from getpass import getpass

os.environ["DAPPIER_API_KEY"] = getpass("Enter our Dappier API key: ")
os.environ["OPENAI_API_KEY"] = getpass("Enter our OpenAI API key: ")

We securely capture our Dappier and OpenAI API credentials at runtime, thereby avoiding the hard-coding of sensitive keys in our notebook. By using getpass, the prompts ensure our inputs remain hidden, and setting them as environment variables makes them available to all subsequent cells without exposing them in logs.
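On repeated runs of the notebook, prompting for keys every time can be tedious. A small guard can reuse a key that is already set in the environment and only prompt when it is missing; `load_key` below is a helper name of our own, not part of Dappier or LangChain:

```python
import os
from getpass import getpass

def load_key(var_name: str) -> str:
    """Return the env var if already set; otherwise prompt once and cache it."""
    value = os.environ.get(var_name)
    if not value:
        value = getpass(f"Enter {var_name}: ")
        os.environ[var_name] = value
    return value

# With the variable pre-set, no prompt is triggered (dummy value for illustration).
os.environ["DAPPIER_API_KEY"] = "sk-demo"
print(load_key("DAPPIER_API_KEY"))
```

This keeps the key out of the notebook source while avoiding a fresh prompt on every re-execution of the cell.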

from langchain_dappier import DappierRealTimeSearchTool

search_tool = DappierRealTimeSearchTool()
print("Real-time search tool ready:", search_tool)

We import Dappier’s real‐time search module and create an instance of the DappierRealTimeSearchTool, enabling our notebook to execute live web queries. The print statement confirms that the tool has been initialized successfully and is ready to handle search requests.

from langchain_dappier import DappierAIRecommendationTool

recommendation_tool = DappierAIRecommendationTool(
    data_model_id="dm_01j0pb465keqmatq9k83dthx34",
    similarity_top_k=3,
    ref="sportsnaut.com",
    num_articles_ref=2,
    search_algorithm="most_recent",
)
print("Recommendation tool ready:", recommendation_tool)

We set up Dappier’s AI-powered recommendation engine by specifying our custom data model, the number of similar articles to retrieve, and the source domain for context. The DappierAIRecommendationTool instance will now use the “most_recent” algorithm to pull the top three similar articles from our data model (similarity_top_k=3), including up to two articles from the specified reference domain, sportsnaut.com (num_articles_ref=2), ready for query-driven content suggestions.

from langchain.chat_models import init_chat_model

llm = init_chat_model(
    model="gpt-3.5-turbo",
    model_provider="openai",
    temperature=0,
)
llm_with_tools = llm.bind_tools([search_tool])
print("llm_with_tools ready")

We create an OpenAI chat model instance using gpt-3.5-turbo with a temperature of 0 to ensure consistent responses, and then bind the previously initialized search tool so that the LLM can invoke real-time searches. The final print statement confirms that our LLM is ready to call Dappier’s tools within our conversational flows.

import datetime
from langchain_core.prompts import ChatPromptTemplate

today = datetime.datetime.today().strftime("%Y-%m-%d")
prompt = ChatPromptTemplate([
    ("system", f"You are a helpful assistant. Today is {today}."),
    ("human", "{user_input}"),
    ("placeholder", "{messages}"),
])
llm_chain = prompt | llm_with_tools
print("llm_chain built")

We construct the conversational “chain” by first building a ChatPromptTemplate that injects the current date into a system prompt and defines slots for user input and prior messages. By piping the template (|) into our llm_with_tools, we create an llm_chain that automatically formats prompts, invokes the LLM (with real-time search capability), and handles responses in a seamless workflow. The final print confirms the chain is ready to drive end-to-end interactions.

from langchain_core.runnables import RunnableConfig, chain

@chain
def tool_chain(user_input: str, config: RunnableConfig):
    ai_msg = llm_chain.invoke({"user_input": user_input}, config=config)
    tool_msgs = search_tool.batch(ai_msg.tool_calls, config=config)
    return llm_chain.invoke(
        {"user_input": user_input, "messages": [ai_msg, *tool_msgs]},
        config=config,
    )

print("tool_chain defined")

We define an end-to-end tool_chain that first sends our prompt to the LLM (capturing any requested tool calls), then executes those calls via search_tool.batch, and finally feeds both the AI’s initial message and the tool outputs back into the LLM for a cohesive response. The @chain decorator transforms this into a single, runnable pipeline, allowing us to simply call tool_chain.invoke(…) to handle both thinking and searching in a single step.
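The two-pass control flow of this pattern can be seen without any API keys by substituting tiny stubs for the LLM and the search tool. FakeLLM and fake_search below are illustrative stand-ins of our own, not Dappier or OpenAI classes; they mimic only the shape of the exchange (first pass emits a tool call, second pass answers from the tool result):

```python
class FakeLLM:
    """Stub LLM: first call requests a tool, second call answers from its output."""
    def invoke(self, inputs):
        if "messages" not in inputs:          # pass 1: model decides to search
            return {"content": "", "tool_calls": [{"query": inputs["user_input"]}]}
        tool_output = inputs["messages"][-1]  # pass 2: model reads the tool result
        return {"content": f"Answer based on: {tool_output}", "tool_calls": []}

def fake_search(calls):
    """Stub tool: resolve each requested call into a canned result string."""
    return [f"result for '{c['query']}'" for c in calls]

def tool_chain(user_input, llm):
    ai_msg = llm.invoke({"user_input": user_input})   # pass 1: get tool calls
    tool_msgs = fake_search(ai_msg["tool_calls"])     # execute the tool calls
    return llm.invoke({"user_input": user_input,      # pass 2: final answer
                       "messages": [ai_msg, *tool_msgs]})

print(tool_chain("Who won the last Nobel Prize?", FakeLLM())["content"])
# → Answer based on: result for 'Who won the last Nobel Prize?'
```

The real tool_chain does exactly this, except the LLM decides for itself whether a search is needed and search_tool.batch executes the real Dappier queries.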

res = search_tool.invoke({"query": "What happened at the last Wrestlemania"})
print("Search:", res)

We demonstrate a direct query to Dappier’s real-time search engine, asking “What happened at the last WrestleMania,” and immediately print the structured result. It shows how easily we can leverage search_tool.invoke to fetch up-to-the-moment information and inspect the raw response in our notebook.

rec = recommendation_tool.invoke({"query": "latest sports news"})
print("Recommendation:", rec)

out = tool_chain.invoke("Who won the last Nobel Prize?")
print("Chain output:", out)

Finally, we showcase both the recommendation and full-chain workflows in action. First, we call recommendation_tool.invoke with “latest sports news” to fetch relevant articles from our custom data model and print the suggestions. Then we run tool_chain.invoke("Who won the last Nobel Prize?") to perform an end-to-end LLM query combined with real-time search, printing the AI’s synthesized answer that integrates live data.

In conclusion, we now have a robust baseline for embedding Dappier AI capabilities into any conversational workflow. We’ve seen how effortlessly Dappier’s real-time search empowers our LLM to access fresh facts, while the recommendation tool enables us to deliver contextually relevant insights from proprietary data sources. From here, we can customize search parameters (e.g., refining query filters) or fine-tune recommendation settings (e.g., adjusting similarity thresholds and reference domains) to suit our domain.
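As a starting point for such tuning, the constructor arguments from the recommendation cell above can simply be collected and varied. The values below are illustrative choices of our own (the reference domain and counts are hypothetical, not recommendations from Dappier); the keys mirror the DappierAIRecommendationTool arguments used earlier in the tutorial:

```python
# Illustrative tuning of the recommendation settings from the tutorial.
tuned_params = dict(
    data_model_id="dm_01j0pb465keqmatq9k83dthx34",  # same custom data model as above
    similarity_top_k=5,            # widen the pool of similar articles (was 3)
    ref="espn.com",                # hypothetical alternative reference domain
    num_articles_ref=3,            # pull more articles per reference (was 2)
    search_algorithm="most_recent",
)

# recommendation_tool = DappierAIRecommendationTool(**tuned_params)
print(tuned_params["similarity_top_k"])
```

Re-running the recommendation query after such a change makes it easy to compare how the breadth and recency of the returned articles shift with each parameter.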


Check out the Dappier Platform and Notebook here.


