Fun with Functions: (Almost) Everything About GPT API Functions

Benjamin De Kraker
11 min read · Jun 14, 2023


OpenAI has announced support for “Functions,” a new feature within the GPT-3.5 and GPT-4 APIs.

There’s been both excitement and confusion over what these are, and how they work. Let’s dive in, starting with a top-level overview and then getting into actual code and working examples.

OpenAI officially announced Functions on June 13, 2023.

What is a Function?

You can think of a Function as a “helper” that watches your GPT API prompts, and swings into action if it notices a prompt where it can “help out.”

You — the developer — define which Functions are available to your users, and you determine what happens when one is used.

Here’s an example. Say you’re building a chatbot which uses the GPT-3.5 API on the backend to process user queries. You want to add a feature that tells the user the weather if they happen to ask — but that’s not the only thing your app does. It’s just one aspect of a larger chatbot.

What the Function “helper” does is watch the messages from users for one that looks like it can be answered or addressed using extra code. For normal queries, that extra code isn’t run and the GPT API responds normally.

But — if suddenly the user asks a question that seems to clearly fit a Function you set up, the “helper” jumps in. It lets you know that it can use code to be useful, and it formats text (or data) to fit some pre-set requirements.

Here’s the weather example: If the user types, “How fast can penguins swim,” this will get handled by the GPT API like any other prompt.

But if it sees that the user types, “What is the weather in Los Angeles?” it jumps in to help, because it recognizes that this query matches a Function you told it to watch out for.

When that happens, the API returns a trigger or hook that your code can use to handle the Function. It doesn’t run any code automatically (unless you’ve coded your app to do so). But it assumes you’ll probably want to use the Function, and it gets the data ready to be processed.

What ISN’T a Function?

Let’s get a few things out of the way. A Function isn’t exactly the same as the Plugins, which are available for use within ChatGPT.

They are somewhat the same concept, but currently these are different things. You can’t use Functions to directly call an existing Plugin from the GPT API, for example. But the general way Plugins work is similar to how Functions work.

A Function also isn’t something that runs on OpenAI’s side of things. To continue the weather example, it isn’t GPT itself that will be fetching the weather data.

The actual code that you want to run when a Function is activated is part of your application — and you have control over what happens, or doesn’t happen, whenever a Function triggers.

All it does is generate text and put it into JSON. What you do with that is up to you. From the official OpenAI documentation:

“The Chat Completions API does not call the function; instead, the model generates JSON that you can use to call the function in your code.”
That’s important to understand.
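In code terms, that means your application keeps its own mapping from Function names to real functions, and calls one only when the model asks. Here is a minimal sketch of that idea; the `get_current_weather` stub is a placeholder, not a real weather lookup:

```python
import json

# Placeholder implementation; a real one would fetch live data.
def get_current_weather(location):
    return json.dumps({"location": location, "description": "Sunny"})

# Map the Function names declared to the API onto actual Python callables.
available_functions = {
    "get_current_weather": get_current_weather,
}

# Suppose the model returned this function_call (arguments arrive as a JSON string):
function_call = {
    "name": "get_current_weather",
    "arguments": "{\"location\": \"KDTW\"}",
}

# Your code, not OpenAI, decides whether and how to run it.
func = available_functions[function_call["name"]]
args = json.loads(function_call["arguments"])
result = func(**args)
```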

What does a Function look like?

The main part of a Function is an array. Yep, that’s it — just a big ol’ array of text data that gets passed to the GPT API every time you make a call. Here’s what it looks like in cURL:

  '{
    "model": "gpt-3.5-turbo-0613",
    "messages": [
      {"role": "user", "content": "What is the weather like in Boston?"},
      {"role": "assistant", "content": null, "function_call": {"name": "get_current_weather", "arguments": "{ \"location\": \"Boston, MA\"}"}},
      {"role": "function", "name": "get_current_weather", "content": "{\"temperature\": \"22\", \"unit\": \"celsius\", \"description\": \"Sunny\"}"}
    ],
    "functions": [
      {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "The city and state, e.g. San Francisco, CA"
            },
            "unit": {
              "type": "string",
              "enum": ["celsius", "fahrenheit"]
            }
          },
          "required": ["location"]
        }
      }
    ]
  }'

…and here’s a slightly different version as a Python dictionary:

functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The four-letter ICAO airport code for a METAR weather station. \
Example: KSEA. If the airport code is not provided, infer it from other available location data \
and your knowledge.",
                }
            },
            "required": ["location"]
        }
    }
]

If you have more than one Function, you’d define each and every one here in this array. (This can get long, pretty quickly. More on that later.)
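For illustration, here’s what the array might look like with two Functions defined in Python. The second Function, `get_time`, is hypothetical and exists only to show the shape:

```python
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
            },
            "required": ["location"],
        },
    },
    {
        # Hypothetical second Function, purely to illustrate the array shape.
        "name": "get_time",
        "description": "Get the current local time in a given timezone",
        "parameters": {
            "type": "object",
            "properties": {
                "timezone": {
                    "type": "string",
                    "description": "An IANA timezone name, e.g. America/Detroit",
                },
            },
            "required": ["timezone"],
        },
    },
]
```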

When you make your API call, GPT will literally read the plain-text of what each Function is supposed to do and what format of data it will be dealing with.

Name: The name of this Function, used to refer to it at other times. It must be unique within your program.

Description: A natural-language description of what your Function is supposed to do.

Parameters: A JSON Schema object defining the name, type, and format of the data GPT must provide to your code when the Function runs.

In the above Python example, the code (not pictured) expects a four-letter ICAO airport code like KSEA for Seattle–Tacoma International Airport — so that’s exactly what we tell the API it will need to provide (as a text string) for the parameter “location.”

The above example also tells the API that “location” is required — after all, we can’t really fetch any weather data without knowing where.

Code Examples

Alright, enough rambling — let’s look at some actual code examples. Further down, we’ll also look at the API response in detail so you can understand what’s going on.

Python:

query = input("Enter your query: ")

apiKey = "YOUR-KEY"
url = "https://api.openai.com/v1/chat/completions"

headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer " + apiKey
}

# GPT will read this to understand what Functions we define.
functions = [{
    "name": "get_current_weather",
    "description": "Get the current weather",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The four-letter ICAO airport code for a METAR weather station. \
Example: KSEA. If the airport code is not provided, infer it from other available location data \
and your knowledge."
            },
        },
        "required": ["location"]
    }
}]

# session is a dictionary holding conversation state (e.g., a web framework session)
messages = session.get('messages', [])
if not messages:
    messages.append({"role": "system", "content": "You are a helpful assistant."})

messages.append({"role": "user", "content": query})

# Put all the data together to make the API call
data = {
    "model": "gpt-3.5-turbo-0613",
    "messages": messages,
    "functions": functions
}

This should look generally familiar if you’ve used the GPT API before. We’re preparing the API call normally — auth key goes in the Headers, URL endpoint gets set, message context (if any) is included.

The obvious important things are, again, the giant block of Functions, the model set to “gpt-3.5-turbo-0613” so we can use Functions, and the addition of “functions” as a parameter in the data.

More boilerplate Python to actually make the API call and get a response:

import requests
import json

print(json.dumps(data, indent=4))

response = requests.post(url, headers=headers, json=data)

if response.status_code != 200:
    print('Error:', response.text)
    raise SystemExit('Error: ' + response.text)

response_body = response.json()

Response Examples

Before we look at a bit more code, let’s see what the response examples would actually look like.

These are real responses from the API, pretty-printed so you can see what’s going on. (This is from a Python test script, but response structure should be the same whether you use PHP, JS, or whatever.)


"model": "gpt-3.5-turbo-0613",
"messages": [
    {
        "role": "system",
        "content": "You are a helpful assistant."
    },
    {
        "role": "user",
        "content": "Hello."
    }
]

Again, you should recognize these patterns if you’ve used the GPT API before. The session begins with a system message (“You are a helpful assistant”), and then we send a user message to the API (“Hello.”)

Here’s the full API response to the “Hello” message:


"id": "chatcmpl-7REdgBnrwaIPQrkZhWPBRrP27K1wI",
"object": "chat.completion",
"created": 1686725484,
"model": "gpt-3.5-turbo-0613",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "Hi there! How can I assist you today?"
},
"finish_reason": "stop"
}
]

The GPT API responds with, “Hi there! How can I assist you today?”
This is what we expect — nothing in the user’s query suggested weather, so it has no interest in triggering that Function even though it’s defined in the API data.

Make a note of "finish_reason": "stop". That’s a useful parameter that we will use in a moment to detect whether the API wants to use a Function.

Now let’s ask it about weather:

data = [
    {
        "role": "user",
        "content": "Get weather for Detroit."
    }
]

Here is the full API response:


"id": "chatcmpl-7REiQGQ1WEzRcB0CrIV3GVMxRUnwa",
"object": "chat.completion",
"created": 1686725778,
"model": "gpt-3.5-turbo-0613",
"choices": [
    {
        "index": 0,
        "message": {
            "role": "assistant",
            "content": "",
            "function_call": {
                "name": "get_current_weather",
                "arguments": {
                    "location": "KDTW"
                }
            }
        },
        "finish_reason": "function_call"
    }
]

Now that’s different! Notice a few new things?

It’s still responding as "role": "assistant", but now there’s a new response parameter named "function_call" — and it contains this object:

"name": "get_current_weather",
"arguments": {
    "location": "KDTW"
}

That’s what we wanted! What this response tells us is: GPT saw that the user is asking about weather, and it knows that we have a Function called get_current_weather that we probably want to use.

That’s the structure of a GPT API response when it returns a Function: “name” and “arguments” are what you’d then use to run your coded function.

It has pre-parsed the “location” as “KDTW.” But… we asked for Detroit. Why doesn’t it say Detroit?

This is where Functions are very clever. Wayyyy up there when we defined the Function, we told it:

"location": {
    "type": "string",
    "description": "The four-letter ICAO airport code for a METAR weather station. \
Example: KSEA. If the airport code is not provided, infer it from other available location data \
and your knowledge.",
}

Do you see what’s happening and why it’s important?

We told it in the definition of the Function to expect a four-letter ICAO airport code for the location. We also told it to use its knowledge if that code isn’t expressly input by the user.

GPT is smart enough to realize that the ICAO airport code for Detroit is — you guessed it — KDTW. So without us needing to hand-code it, GPT prepared the Function parameter to match. Neat!

Glance back up at the full API response. Do you see this?

"finish_reason": "function_call"

That’s important. That is the API telling us, “The reason I stopped is because I think we want to call a function here.”

We can use that! That’s a perfect trigger for our code. We can easily detect when an API response lists a finish_reason as “function_call”, and execute a branch of our code to deal with it.

Back to our Python:

# Navigate down the response dictionary tree structure
if 'choices' in response_body and isinstance(response_body['choices'], list) and len(response_body['choices']) > 0:
    firstChoice = response_body['choices'][0]

    # Take the 'finish_reason' parameter and put it in a variable
    finishReason = firstChoice.get('finish_reason', None)

    # This is what we're looking for!
    # If finishReason is 'function_call', do some stuff:
    if finishReason == 'function_call':
        print('GPT thinks this is a Function call.')

        # Save the function name from the API response into a variable
        function_name = firstChoice['message']['function_call']['name']

Now we can execute more code specifically to handle the Function response, because “finish_reason”: “function_call” in the API response told us that’s what we’re dealing with.

This is an important point: GPT does not execute (automatically or otherwise) any code from a Function.

All it does is return text as part of the API response, which you can then use to do whatever you want — or choose not to execute anything at all.

Again: GPT will not auto-execute any code. It’s up to you to include the validation and logic to handle the response on your end.
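One practical detail when you parse the response yourself: in the raw API payload, the `arguments` field is a JSON-formatted string written by the model, so it should go through `json.loads`. And because the model writes it, it can occasionally be malformed. Here is a defensive sketch, assuming a response shaped like the Detroit example above:

```python
import json

def parse_function_call(response_body):
    """Return (name, args) if the response is a Function call, else (None, None)."""
    choice = response_body["choices"][0]
    if choice.get("finish_reason") != "function_call":
        return None, None
    call = choice["message"]["function_call"]
    try:
        # The model emits 'arguments' as a JSON-formatted string.
        args = json.loads(call["arguments"])
    except json.JSONDecodeError:
        # Model-generated JSON can occasionally be invalid; fail gracefully.
        return call["name"], None
    return call["name"], args

# Example with a response shaped like the Detroit one:
body = {
    "choices": [{
        "message": {
            "role": "assistant",
            "content": "",
            "function_call": {
                "name": "get_current_weather",
                "arguments": "{\"location\": \"KDTW\"}",
            },
        },
        "finish_reason": "function_call",
    }]
}
name, args = parse_function_call(body)
```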

Functions within your code

So, what does the program actually do once it has identified that we probably want to use a Function? That’s up to you!

This is where you’ll need to code the Function inside your application — a real, proper function. A code function, not a GPT function. (Yes, it’s slightly confusing.)

Here’s some example Python showing how you can use code to handle this:

import json

# The API returns 'arguments' as a JSON-formatted string, so parse it first,
# then put the location into a variable
arguments = json.loads(response_body['choices'][0]['message']['function_call']['arguments'])
location = arguments['location']

if function_name == "get_current_weather":
    # Do stuff for this function
    print("The function name is get_current_weather.")

You would of course probably want to do something more advanced in the code, like fetch an external weather API and pass along the location (KDTW, in our example), then read the response.

Here’s a quick example of making that external API call to get weather data for the airport code:

import requests

if function_name == "get_current_weather":
    # Do stuff for this function
    print("The function name is get_current_weather.")

    # Set up the URL, passing on the location variable such as KDTW
    url = f'https://www.aviationweather.gov/adds/dataserver_current/httpparam?dataSource=metars&requestType=retrieve&format=xml&hoursBeforeNow=3&mostRecent=true&stationString={location}'

    # Use requests to fetch the URL
    response = requests.get(url)

    # Check if the request was successful
    if response.status_code == 200:
        function_response = response.text
        print(function_response)
    else:
        print("Error making the request.")

This would return a response from the weather API which might look like this:

KDTW 140415Z AUTO 00000KT 10SM CLR 09/09 A2958 RMK AO2 T00910086

(Don’t worry, this is in METAR format and GPT can understand it just fine.)

Returning Responses Back to GPT

Great! So now we’ve recognized that the user wants weather data, invoked a section of code to handle this, fetched the real-world weather from an external API… now what?!

We can pass that weather data right back to the GPT API, tagging it as the Function response so that it can parse it into plain English for the user.

model="gpt-3.5-turbo-0613",
messages=[
    {"role": "user", "content": "Get weather for Detroit."},
    message,  # the assistant message containing the function_call from the first response
    {
        "role": "function",
        "name": function_name,
        "content": function_response,
    },
]

You would then make the standard GPT API call with the above settings.

The message parameters

“role”: “function”, “name”: function_name, “content”: function_response

tell GPT that this is the returned response from the function, with function_name still holding the value “get_current_weather” from before, and function_response holding the METAR weather data.

{
    "role": "function",
    "name": "get_current_weather",
    "content": "KDTW 140415Z AUTO 00000KT 10SM CLR 09/09 A2958 RMK AO2 T00910086"
}

And here’s the response from the GPT API! It can read the weather data just fine, and helpfully parses it into human-readable text:

"choices": [
    {
        "index": 0,
        "message": {
            "role": "assistant",
            "content": "The current weather at the location with the airport code KDTW is as follows:

            - Temperature: 9°C
            - Dewpoint: 9°C
            - Visibility: 10 miles
            - Wind: Calm
            - Sky conditions: Clear
            - Altimeter: 29.58 inHg

            Please note that this information is subject to change."
        },
        "finish_reason": "stop"
    }
]

Token Usage

Remember how we mentioned that the GPT Function definition gets passed to the API every time a request is sent?

This is one thing to consider: This text counts as tokens (and therefore API fees). The more Functions you define, the longer this text is — and the more tokens you use.

For a fairly inexpensive model like GPT-3.5, this may not be a huge problem for most users. However, the extra token usage is a consideration if you’re using a more costly model like GPT-4.
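If you want a rough sense of that overhead, you can measure the serialized size of your functions array. The ratio below (about 4 characters per token for English-like text) is only a rule of thumb, not an exact count; for exact counts you would use a real tokenizer such as OpenAI’s tiktoken library:

```python
import json

functions = [{
    "name": "get_current_weather",
    "description": "Get the current weather",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The four-letter ICAO airport code for a METAR weather station.",
            },
        },
        "required": ["location"],
    },
}]

serialized = json.dumps(functions)
# Rough heuristic: ~4 characters per token. This is an approximation only.
approx_tokens = len(serialized) // 4
print(f"~{approx_tokens} tokens added to every request")
```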

Conclusion

GPT API Functions may not be a magic solution, but they can be a useful way to add features to your application and handle user requests in elegant ways.

Give them a try — and happy coding!
