
GPT Function Calling: 5 Underrated Use Cases | by Max Brodeur-Urbas | Nov, 2023


OpenAI’s backend converting messy unstructured data to structured data via functions

OpenAI’s “Function Calling” might be the most groundbreaking yet underappreciated feature released by any software company… ever.

Functions allow you to turn unstructured data into structured data. This might not sound all that groundbreaking, but when you consider that 90% of data processing and data entry jobs worldwide exist for this exact reason, it’s quite a revolutionary feature that went somewhat unnoticed.

Have you ever found yourself begging GPT (3.5 or 4) to spit out the answer you want and absolutely nothing else? No “Sure, here is your…” or any other useless fluff surrounding the core answer. GPT Functions are the solution you’ve been looking for.

How are Functions meant to work?

OpenAI’s docs on function calling are extremely limited. You’ll find yourself digging through their developer forum for examples of how to use them. I dug around the forum for you and have many examples coming up.

Here’s one of the only examples you’ll be able to find in their docs:

functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }
]

A function definition is a rigid JSON format that defines a function name, description and parameters. In this case, the function is meant to get the current weather. Obviously GPT isn’t able to call this specific API (since it doesn’t exist), but using this structured response you’d hypothetically be able to connect it to the real API.

At a high level, however, functions provide two layers of inference:

Picking the function itself:

You may notice that functions are passed into the OpenAI API call as an array. The reason you provide a name and description for each function is so GPT can decide which to use based on a given prompt. Providing multiple functions in your API call is like giving GPT a Swiss Army knife and asking it to cut a piece of wood in half. It knows that even though it has a pair of pliers, scissors and a knife, it should use the saw!

Function definitions count toward your token usage. Passing in hundreds of functions would not only take up the majority of your token limit but also result in a drop in response quality. I often don’t even use this feature and only pass in one function that I force it to use. It is very nice to have in certain use cases, however.
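As a sketch of that setup (the model choice and prompt here are placeholders, and the actual `openai.ChatCompletion.create` call from the pre-1.0 Python SDK current in Nov 2023 is left commented out so no API key is needed), forcing GPT to use a single function looks like this:

```python
# Sketch: forcing GPT to call exactly one function (legacy openai<1.0 SDK shape).
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }
]

request = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "What's the weather in Boston?"}],
    "functions": functions,
    # Passing a specific name (instead of "auto") forces this exact function.
    "function_call": {"name": "get_current_weather"},
}
# response = openai.ChatCompletion.create(**request)
```

With `"function_call": "auto"` instead, GPT picks from the array on its own; naming one function removes that first layer of inference entirely.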

Picking the parameter values based on a prompt:

This is the real magic in my opinion. GPT being able to choose the tool in its tool kit is amazing and definitely the focus of their feature announcement, but I think this applies to many more use cases.

You can think of a function like handing GPT a form to fill out. It uses its reasoning, the context of the situation and the field names/descriptions to decide how it will fill out each field. Designing the form and the extra information you pass in is where you can get creative.

GPT filling out your custom form (function parameters)
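Concretely, the “filled form” comes back as a `function_call` object whose `arguments` field is a JSON string, not a parsed object. A minimal sketch of reading it (the response message below is canned, in the shape the API returned at the time):

```python
import json

# Canned assistant message in the Nov-2023 Chat Completions response shape;
# in real use this comes from response["choices"][0]["message"].
response_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_current_weather",
        "arguments": '{"location": "San Francisco, CA", "unit": "celsius"}',
    },
}

# The "form fields" arrive as a JSON-encoded string, so they must be parsed.
args = json.loads(response_message["function_call"]["arguments"])
print(args["location"])  # San Francisco, CA
```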

Data Extraction

One of the most common things I use functions for is extracting specific values from a large chunk of text. The sender’s address from an email, a founder’s name from a blog post, a phone number from a landing page.

I like to imagine I’m searching for a needle in a haystack, except the LLM burns the haystack, leaving nothing but the needle(s).

GPT Data Extraction Personified.

Use case: Processing thousands of contest submissions

I built an automation that iterated over thousands of contest submissions. Before storing these in a Google Sheet I wanted to extract the email associated with the submission. Here’s the function call I used for extracting their email.

{
    "name": "update_email",
    "description": "Updates email based on the content of their submission.",
    "parameters": {
        "type": "object",
        "properties": {
            "email": {
                "type": "string",
                "description": "The email provided in the submission"
            }
        },
        "required": [
            "email"
        ]
    }
}
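A minimal sketch of that extraction loop, with the API call stubbed out by a hypothetical `call_gpt` helper that returns a canned payload so the snippet runs standalone:

```python
import json

def call_gpt(submission_text):
    # Hypothetical stand-in for a Chat Completions request that forces the
    # update_email function above; returns a canned function_call payload.
    return {"name": "update_email", "arguments": '{"email": "jane@example.com"}'}

submissions = [
    "Repo: github.com/jane/contest-entry. Reach me at jane@example.com!",
]

emails = []
for text in submissions:
    function_call = call_gpt(text)
    # GPT returns only the structured field -- no surrounding fluff to strip.
    emails.append(json.loads(function_call["arguments"])["email"])

print(emails)  # ['jane@example.com']
```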

Scoring

Assigning unstructured data a score based on dynamic, natural language criteria is a wonderful use case for functions. You could score comments during sentiment analysis, essays based on a custom grading rubric, or a loan application for risk based on key factors. A recent use case I applied scoring to was scoring sales leads from 0–100 based on their viability.

Use Case: Scoring sales leads

We had hundreds of potential leads in a single Google Sheet a few months ago that we wanted to tackle from most to least important. Each lead contained information like company size, contact name, position, industry, etc.

Using the following function we scored each lead from 0–100 based on our needs and then sorted them from best to worst.

{
    "name": "update_sales_lead_value_score",
    "description": "Updates the score of a sales lead and provides a justification",
    "parameters": {
        "type": "object",
        "properties": {
            "sales_lead_value_score": {
                "type": "number",
                "description": "An integer value ranging from 0 to 100 that represents the quality of a sales lead based on these criteria. 100 is a perfect lead, 0 is terrible. Ideal Lead Criteria:\n- Medium sized companies (300-500 employees is the best range)\n- Companies in primary resource heavy industries are best, ex. manufacturing, agriculture, etc. (this is the most important criteria)\n- The higher up the contact position, the better. VP or Executive level is preferred."
            },
            "score_justification": {
                "type": "string",
                "description": "A clear and concise justification for the score provided based on the custom criteria"
            }
        },
        "required": [
            "sales_lead_value_score",
            "score_justification"
        ]
    }
}
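Once the `function_call` arguments are parsed for each lead, the best-to-worst sort is plain Python. The leads and scores below are invented for illustration:

```python
# Hypothetical parsed outputs of update_sales_lead_value_score for three leads.
leads = [
    {"company": "Acme Mfg", "sales_lead_value_score": 88,
     "score_justification": "Medium-sized manufacturer, VP contact."},
    {"company": "TinyShop", "sales_lead_value_score": 15,
     "score_justification": "Too small, unrelated industry."},
    {"company": "AgriCo", "sales_lead_value_score": 72,
     "score_justification": "Agriculture, but junior contact."},
]

# Sort best-to-worst on the score GPT assigned.
leads.sort(key=lambda lead: lead["sales_lead_value_score"], reverse=True)
print([lead["company"] for lead in leads])  # ['Acme Mfg', 'AgriCo', 'TinyShop']
```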

Categorization

Define custom buckets and have GPT thoughtfully consider each piece of data you give it and place it in the correct bucket. This can be used for labelling tasks like selecting the category of YouTube videos, or for discrete scoring tasks like assigning letter grades to homework assignments.

Use Case: Labelling news articles

A very common first step in data processing workflows is separating incoming data into different streams. A recent automation I built did exactly this with news articles scraped from the web. I wanted to sort them based on the topic of the article and include a justification for the decision once again. Here’s the function I used:

{
    "name": "categorize",
    "description": "Categorize the input data into user defined buckets.",
    "parameters": {
        "type": "object",
        "properties": {
            "category": {
                "type": "string",
                "enum": [
                    "US Politics",
                    "Pandemic",
                    "Economy",
                    "Pop culture",
                    "Other"
                ],
                "description": "US Politics: Related to US politics or US politicians, Pandemic: Related to the Coronavirus Pandemic, Economy: Related to the economy of a specific country or the world., Pop culture: Related to pop culture, celebrity media or entertainment., Other: Doesn't fit in any of the defined categories."
            },
            "justification": {
                "type": "string",
                "description": "A short justification explaining why the input data was categorized into the selected category."
            }
        },
        "required": [
            "category",
            "justification"
        ]
    }
}
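Downstream of the function call, routing an article into its stream is a simple dictionary lookup on the returned category; the parsed arguments below are illustrative:

```python
import json

# Illustrative function_call arguments for one scraped article.
arguments = '{"category": "Economy", "justification": "Discusses inflation data."}'
result = json.loads(arguments)

# One output stream per user-defined bucket; the enum guarantees the key exists.
streams = {"US Politics": [], "Pandemic": [], "Economy": [],
           "Pop culture": [], "Other": []}
streams[result["category"]].append(result["justification"])

print(len(streams["Economy"]))  # 1
```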

Option Selection

Often when processing data, I give GPT many possible options and want it to select the best one based on my needs. I only want the value it selected, no surrounding fluff or extra thoughts. Functions are perfect for this.

Use Case: Finding the “most interesting AI news story” from Hacker News

I wrote another Medium article about how I automated my entire Twitter account with GPT. Part of that process involves selecting the most relevant posts from the front pages of Hacker News. This post selection step leverages functions!

To summarize the functions portion of the use case, we’d scrape the first n pages of Hacker News and ask GPT to select the post most relevant to “AI news or tech news”. GPT would return only the headline and the link, selected via functions, so that I could go on to scrape that website and generate a tweet from it.

I’d pass in the user defined query as part of the message and use the following function definition:

{
    "name": "find_best_post",
    "description": "Determine the best post that most closely reflects the query.",
    "parameters": {
        "type": "object",
        "properties": {
            "best_post_title": {
                "type": "string",
                "description": "The title of the post that most closely reflects the query, stated exactly as it appears in the list of titles."
            }
        },
        "required": [
            "best_post_title"
        ]
    }
}
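Because the description asks for the title exactly as it appears, the returned value can be used as a direct dictionary key to recover the link to scrape. The posts below are invented:

```python
# Hypothetical scraped Hacker News front page: title -> link.
posts = {
    "Show HN: I automated my Twitter account with GPT": "https://news.example.com/item1",
    "Rust 1.74 released": "https://news.example.com/item2",
}

# best_post_title as GPT would return it via the function call, verbatim.
best_post_title = "Show HN: I automated my Twitter account with GPT"
link = posts[best_post_title]  # exact-match lookup recovers the link to scrape
print(link)  # https://news.example.com/item1
```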

Filtering

Filtering is a subset of categorization where you categorize items as either true or false based on a natural language condition. A condition like “is Spanish” will be able to filter out all Spanish comments, articles etc. using a simple function and a conditional statement immediately after.

Use Case: Filtering contest submissions

The same automation that I mentioned in the “Data Extraction” section used AI-powered filtering to weed out contest submissions that didn’t meet the deal-breaking criteria. Things like “must use TypeScript” were absolutely mandatory for the coding contest at hand. We used functions to filter out submissions and trim down the total set being processed by 90%. Here is the function definition we used.

{
    "name": "apply_condition",
    "description": "Used to decide whether the input meets the user provided condition.",
    "parameters": {
        "type": "object",
        "properties": {
            "decision": {
                "type": "string",
                "enum": [
                    "True",
                    "False"
                ],
                "description": "True if the input meets this condition 'Does submission meet ALL these requirements (uses typescript, uses tailwindcss, functional demo)', False otherwise."
            }
        },
        "required": [
            "decision"
        ]
    }
}
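The conditional statement that follows the function call is then a one-line check; note that the enum yields the strings "True"/"False", not booleans. The decisions below are made up:

```python
import json

# Hypothetical function_call arguments for three submissions.
raw_decisions = [
    '{"decision": "True"}',
    '{"decision": "False"}',
    '{"decision": "True"}',
]
submissions = ["sub_a", "sub_b", "sub_c"]

# Keep only submissions GPT judged to meet every requirement.
kept = [
    sub for sub, raw in zip(submissions, raw_decisions)
    if json.loads(raw)["decision"] == "True"  # enum is a string, not a bool
]
print(kept)  # ['sub_a', 'sub_c']
```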

If you’re curious why I love functions so much or what I’ve built with them, you should check out AgentHub!

AgentHub is the Y Combinator-backed startup I co-founded that lets you automate any repetitive or complex workflow with AI via a simple drag and drop no-code platform.

“Imagine Zapier but AI-first and on crack.” — Me

Automations are built with individual nodes called “Operators” that are linked together to create powerful AI pipelines. We have a catalog of AI-powered operators that leverage functions under the hood.

Our current AI-powered operators that use functions!

Check out these templates to see examples of function use cases on AgentHub: Scoring, Categorization, Option-Selection.

If you want to start building, AgentHub is live and ready to use! We’re very active in our Discord community and are happy to help you build your automations if needed.

Feel free to follow the official AgentHub Twitter for updates, and follow me for AI-related content.
