OpenAI ChatGPT-4 Development Notes 2024-03: Chat — Function Calling / Function / Tool / Tool_Choice

2024-01-07 17:11:09

Updates on Function Calling were a major highlight at OpenAI DevDay.

In other words, the original function-call code no longer works as-is; it all has to be rewritten.

function and function_call have been replaced entirely by tools and tool_choice. Any function-call code written before November 2023 is effectively dead.

Why introduce tool to replace function at all? There are plenty of reasons, which I won't go into here. As programmers, what we really care about is: what do we need to change?

In short, tools is the mechanism that combines ChatGPT's capabilities with your own code. How does it work?

Step 1: define your function. What is its job?

import json
from openai import OpenAI
client = OpenAI()

# Example dummy function hard coded to return the same weather
# In production, this could be your backend API or an external API
def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    if "beijing" in location.lower():
        return json.dumps({"location": location, "temperature": "10", "unit": "celsius"})
    elif "tokyo" in location.lower():
        return json.dumps({"location": location, "temperature": "22", "unit": "celsius"})
    elif "shanghai" in location.lower():
        return json.dumps({"location": location, "temperature": "21", "unit": "celsius"})
    elif "san francisco" in location.lower():
        return json.dumps({"location": location, "temperature": "72", "unit": "fahrenheit"})
    else:
        return json.dumps({"location": location, "temperature": "22.22", "unit": "celsius"})

Step 2: call the ChatGPT model

Let ChatGPT do the work: send it the question along with the available tools, and see what it wants to do.

def run_conversation():
    # Step 1: send the conversation and available functions to the model
    messages = [{"role": "user", "content": "What's the weather like in San Francisco, Tokyo, Beijing and Paris?"}]
    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather in a given location",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "The city and state, e.g. San Francisco, CA",
                        },
                        "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                    },
                    "required": ["location"],
                },
            },
        }
    ]
    
    response = client.chat.completions.create(
        model="gpt-3.5-turbo-1106",
        messages=messages,
        tools=tools,
        tool_choice="auto",  # auto is default, but we'll be explicit
    )
    response_message = response.choices[0].message
    tool_calls = response_message.tool_calls

The tool_choice parameter lets the ChatGPT model decide on its own whether a function call is needed.
response is the returned object; its message contains a tool_calls array.
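
Besides "auto", tool_choice also accepts "none" (never call a tool) or an object naming one specific function to force that call. A minimal sketch of the three forms, reusing the messages and tools defined above:

# tool_choice="none"  -> the model answers directly and never calls a tool
# tool_choice="auto"  -> the model decides (the default when tools are supplied)
# Forcing a specific function by name:
forced = client.chat.completions.create(
    model="gpt-3.5-turbo-1106",
    messages=messages,
    tools=tools,
    tool_choice={"type": "function", "function": {"name": "get_current_weather"}},
)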

tool_calls (array): The tool calls generated by the model, such as function calls.
    id (string): The ID of the tool call.
    type (string): The type of the tool. Currently, only "function" is supported.
    function (object): The function that the model called.
        name (string): The name of the function to call.
        arguments (string): The arguments to call the function with, as generated by the model in JSON format. Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function.
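
Because arguments is a model-generated JSON string, it pays to parse and validate it defensively before dispatching. A minimal sketch (the helper name safe_call_weather is made up for illustration):

def safe_call_weather(tool_call):
    """Parse and sanity-check a tool call's arguments before dispatching."""
    # arguments is a model-generated string and may not be valid JSON
    try:
        args = json.loads(tool_call.function.arguments)
    except json.JSONDecodeError:
        return json.dumps({"error": "model returned invalid JSON arguments"})
    # Enforce the schema: "location" is required, "unit" must be one of the enum values
    if "location" not in args:
        return json.dumps({"error": "missing required argument: location"})
    unit = args.get("unit") if args.get("unit") in ("celsius", "fahrenheit") else "fahrenheit"
    return get_current_weather(location=args["location"], unit=unit)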

Step 3: if ChatGPT decides a function is needed, it returns a JSON object describing the call.

    # Step 2: check if the model wanted to call a function
    if tool_calls:
        # Step 3: call the function
        # Note: the JSON response may not always be valid; be sure to handle errors
        available_functions = {
            "get_current_weather": get_current_weather,
        }  # only one function in this example, but you can have multiple
        messages.append(response_message)  # extend conversation with assistant's reply
        # Step 4: send the info for each function call and function response to the model
        for tool_call in tool_calls:
            function_name = tool_call.function.name
            function_to_call = available_functions[function_name]
            function_args = json.loads(tool_call.function.arguments)
            function_response = function_to_call(
                location=function_args.get("location"),
                unit=function_args.get("unit"),
            )
            messages.append(
                {
                    "tool_call_id": tool_call.id,
                    "role": "tool",
                    "name": function_name,
                    "content": function_response,
                }
            )  # extend conversation with function response
        second_response = client.chat.completions.create(
            model="gpt-3.5-turbo-1106",
            messages=messages,
        )  # get a new response from the model where it can see the function response
        return second_response
print(run_conversation())    

We append this returned JSON, together with each function's response, to the messages and call the ChatGPT model again. The result:

ChatCompletion(id='chatcmpl-8ciuEU38jFKJcjEbQH66ejGNnp0kO', 
choices=[Choice(finish_reason='stop', index=0, logprobs=None, message=ChatCompletionMessage(
content="Currently, the weather in San Francisco, California is 72°F (22°C) with a slight breeze. In Tokyo, Japan, the temperature is 22°C with partly cloudy skies. In Beijing, China, it's 10°C with overcast conditions. And in Paris, France, the temperature is 22.22°C with clear skies.", 
role='assistant', function_call=None, tool_calls=None))], 
created=1704239774, model='gpt-3.5-turbo-1106', object='chat.completion', system_fingerprint='fp_772e8125bb', usage=CompletionUsage(completion_tokens=71, prompt_tokens=229, total_tokens=300))
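
run_conversation() returns the whole ChatCompletion object (or None if the model chose not to call any tool); the final natural-language answer lives in choices[0].message.content. A small usage sketch:

final = run_conversation()
if final is not None:
    # choices[0].message.content holds the model's synthesized answer
    # after it has seen every tool response
    print(final.choices[0].message.content)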

tools and tool_choice replace the old functions and function_call.
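
For migration purposes, the old-style call looked roughly like this (a rough sketch from memory of the pre-November-2023 API; verify against your pinned SDK version):

# Old style (deprecated): flat function definitions plus function_call
old_response = client.chat.completions.create(
    model="gpt-3.5-turbo-1106",
    messages=messages,
    functions=[
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        }
    ],
    function_call="auto",
)
# Old style returned a single call in message.function_call (name + arguments)
# and expected the result appended with role="function".
# New style wraps each definition as {"type": "function", "function": {...}},
# can return several tool_calls at once, and expects each result appended
# with role="tool" plus the matching tool_call_id.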

Source: https://blog.csdn.net/longwo888/article/details/135353091