Function Calling gives an LLM access to external tools, extending what the LLM can do. To use Function Calling, you need an LLM that supports it.
LLM With Function Calling
An LLM with Function Calling support is usually produced by fine-tuning a base LLM: training it on a small amount of data so that, given a Prompt, it can recognize which Function to use and emit the Function to call, together with its required parameters, as structured JSON output.
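The exact schema of that structured output varies by vendor, but it generally names a function and carries its arguments as JSON. A minimal, hypothetical illustration (the `get_weather` name and its fields are made up here, not any vendor's actual format):

```python
import json

# Hypothetical raw text a function-calling model might emit:
raw = '{"name": "get_weather", "arguments": {"city": "Shanghai", "date": "tomorrow"}}'

call = json.loads(raw)
function_name = call["name"]       # which Function to invoke
function_args = call["arguments"]  # the parameters to pass to it
print(function_name, function_args)
```

Because the output is machine-parseable JSON rather than free-form text, the calling program can dispatch on it reliably.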
A Function is essentially a Tool, and Tools extend what the LLM can do. For example, tomorrow's weather in Shanghai is real-time data that an LLM cannot know on its own, so the problem is solved by combining LLM + Tool.
Understand Function Calling
A common misconception about Function Calling is that the LLM calls the Function directly. It does not. In practice, the LLM decides from the prompt whether a Function should be called, and that prompt has three parts:
- System Prompt: tells the LLM what this Agent does
- User Prompt: the question the user typed in
- Tool Description: a structured description of a Function, telling the LLM what the Function is and what parameters it takes
These prompts are passed to the LLM, which decides from the user's intent whether a Tool call is needed. If so, the LLM returns a JSON object containing the name of the function to call and its arguments.
On receiving that response, the program calls the Function the LLM named, passing in the arguments the LLM supplied. Once the call completes, the result is sent back to the LLM together with the earlier prompts, and the LLM summarizes the result into a reply for the user.
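The dispatch step described above can be sketched as follows; the `add` tool and the payload shape are illustrative stand-ins, not any specific vendor's API:

```python
import json

def dispatch(tool_call, available_functions):
    """Look up the function the model named and call it with the model-supplied arguments."""
    fn = available_functions[tool_call["name"]]
    args = json.loads(tool_call["arguments"])  # arguments arrive as a JSON string
    return fn(**args)

# An illustrative tool and a fake tool-call payload:
def add(a, b):
    return a + b

result = dispatch({"name": "add", "arguments": '{"a": 1, "b": 2}'}, {"add": add})
print(result)  # 3
```

The real example below follows the same pattern, with the extra step of sending the function's result back to the model for a final answer.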
Example
The weather-lookup example below shows how Function Calling is implemented in practice. First, initialize an LLM client, here using Azure OpenAI:
```python
import json
import os

import openai

client = openai.AzureOpenAI(
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_version=os.getenv("AZURE_OPENAI_API_VERSION"),
)
DEPLOYMENT_NAME = os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME")
```
Next, write a Tool that gets the weather:
```python
def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    if "tokyo" in location.lower():
        return json.dumps({"location": "Tokyo", "temperature": "10", "unit": unit})
    elif "san francisco" in location.lower():
        return json.dumps({"location": "San Francisco", "temperature": "72", "unit": unit})
    elif "paris" in location.lower():
        return json.dumps({"location": "Paris", "temperature": "22", "unit": unit})
    else:
        return json.dumps({"location": location, "temperature": "unknown"})
```
Then define the Tool's description and run the conversation with the LLM (the `check_args` helper is called by the sample but was not defined in it; a signature-based reconstruction is included here):
```python
import inspect


def check_args(function, args):
    """Verify that the model-supplied args match the function's signature
    (reconstructed helper: rejects unknown arguments and missing required ones)."""
    sig = inspect.signature(function)
    for name in args:
        if name not in sig.parameters:
            return False
    for name, param in sig.parameters.items():
        if param.default is param.empty and name not in args:
            return False
    return True


def get_function_and_args(tool_call, available_functions):
    """
    Retrieves the function and its arguments based on the tool call.
    Verifies if the function exists and has the correct number of arguments.

    Args:
        tool_call (ToolCall): The tool call object containing the function name and arguments.
        available_functions (dict): A dictionary of available functions.

    Returns:
        tuple: A tuple containing the function to call and its arguments.
               If the function or arguments are invalid, returns an error message and None.
    """
    # verify function exists
    if tool_call.function.name not in available_functions:
        return "Function " + tool_call.function.name + " does not exist", None
    function_to_call = available_functions[tool_call.function.name]

    # verify function has correct number of arguments
    function_args = json.loads(tool_call.function.arguments)
    if check_args(function_to_call, function_args) is False:
        return "Invalid number of arguments for function: " + tool_call.function.name, None
    return function_to_call, function_args


def run_conversation():
    messages = [
        {
            "role": "system",
            "content": """
                You are a helpful assistant.
                You have access to a function that can get the current weather in a given location.
                Determine a reasonable Unit of Measurement (Celsius or Fahrenheit)
                for the temperature based on the location.
            """,
        },
        {
            "role": "user",
            "content": "What's the weather like in San Francisco, Tokyo?",
        },
    ]
    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": """
                    Get the current weather in a given location.
                    Note: any US cities have temperatures in Fahrenheit
                """,
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "The city and state, e.g. San Francisco, CA",
                        },
                        "unit": {
                            "type": "string",
                            "description": "Unit of Measurement (Celsius or Fahrenheit) for the temperature based on the location",
                            "enum": ["celsius", "fahrenheit"],
                        },
                    },
                    "required": ["location"],
                },
            },
        }
    ]
    response = client.chat.completions.create(
        model=DEPLOYMENT_NAME,
        messages=messages,
        tools=tools,
        tool_choice="auto",  # auto is default, but we'll be explicit
        temperature=0,  # Adjust the variance by changing the temperature value
    )
    response_message = response.choices[0].message
    tool_calls = response_message.tool_calls
    if tool_calls:
        messages.append(response_message)  # extend conversation with assistant's reply
        # only one function in this example, but you can have multiple
        available_functions = {
            "get_current_weather": get_current_weather,
        }
        for tool_call in tool_calls:
            # Step 3: call the function
            # Note: the JSON response may not always be valid; be sure to handle errors
            function_name = tool_call.function.name
            # get the function and arguments
            function_to_call, function_args = get_function_and_args(tool_call, available_functions)
            # call the function
            function_response = function_to_call(**function_args)
            # Step 4: send the info for each function call and function response to the model
            messages.append(
                {
                    "tool_call_id": tool_call.id,
                    "role": "tool",
                    "name": function_name,
                    "content": function_response,
                }
            )  # extend conversation with function response
        second_response = client.chat.completions.create(
            model=DEPLOYMENT_NAME,
            messages=messages,
            temperature=0,
        )
        # get a new response from the model where it can see the function response
        return second_response
```
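As the comment in the sample notes, the model's `arguments` string is not guaranteed to be valid JSON. One defensive option (a sketch, not part of the original sample) is to fall back to an empty dict on a parse failure instead of letting `json.loads` raise:

```python
import json

def safe_parse_args(raw_arguments):
    """Parse model-supplied arguments, returning an empty dict when the JSON is malformed."""
    try:
        return json.loads(raw_arguments)
    except json.JSONDecodeError:
        return {}

print(safe_parse_args('{"location": "Paris"}'))  # {'location': 'Paris'}
print(safe_parse_args("not valid json"))         # {}
```

In a production Agent you would likely also log the bad payload or ask the model to retry, rather than silently dropping the arguments.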
Finally, add an entry point:
```python
if __name__ == "__main__":
    result = run_conversation()
    message_content = result.choices[0].message.content
    print(message_content)
```