
Using OpenAI functions

This walkthrough demonstrates how to incorporate the OpenAI function-calling API in a chain. We'll go over:

  1. How to use functions to get structured outputs from ChatOpenAI
  2. How to create a generic chain that uses (multiple) functions
  3. How to create a chain that actually executes the chosen function

Getting structured outputs

We can take advantage of OpenAI functions to try to force the model to return a particular kind of structured output. We'll use the model's withStructuredOutput method to create our chain, which takes the desired structured output as a valid JSON Schema.

import { ChatPromptTemplate } from "@langchain/core/prompts";
import { ChatOpenAI } from "@langchain/openai";

const jsonSchema = {
  title: "Person",
  description: "Identifying information about a person.",
  type: "object",
  properties: {
    name: { title: "Name", description: "The person's name", type: "string" },
    age: { title: "Age", description: "The person's age", type: "integer" },
    fav_food: {
      title: "Fav Food",
      description: "The person's favorite food",
      type: "string",
    },
  },
  required: ["name", "age"],
};

const model = new ChatOpenAI();
const prompt = ChatPromptTemplate.fromMessages([
  ["human", "Human description: {description}"],
]);

// `withStructuredOutput` already parses the function-call arguments into an
// object, so no separate output parser is needed.
const runnable = prompt.pipe(model.withStructuredOutput(jsonSchema));

const response = await runnable.invoke({
  description:
    "My name's John Doe and I'm 30 years old. My favorite kind of food are chocolate chip cookies.",
});
console.log(response);
/*
{ name: 'John Doe', age: 30, fav_food: 'chocolate chip cookies' }
*/


Tip: You can see a LangSmith trace of this example here.

Creating a generic OpenAI functions chain

To create a generic OpenAI functions chain, we can use the createOpenAIFnRunnable method. This is similar to the structured output example above, except that instead of taking a single output schema, it takes a sequence of function definitions.

import { ChatPromptTemplate } from "@langchain/core/prompts";
import { ChatOpenAI } from "@langchain/openai";
import { createOpenAIFnRunnable } from "langchain/chains/openai_functions";
import { JsonOutputFunctionsParser } from "langchain/output_parsers";

const openAIFunction = {
  name: "get_person_details",
  description: "Get details about a person",
  parameters: {
    title: "Person",
    description: "Identifying information about a person.",
    type: "object",
    properties: {
      name: { title: "Name", description: "The person's name", type: "string" },
      age: { title: "Age", description: "The person's age", type: "integer" },
      fav_food: {
        title: "Fav Food",
        description: "The person's favorite food",
        type: "string",
      },
    },
    required: ["name", "age"],
  },
};

const model = new ChatOpenAI();
const prompt = ChatPromptTemplate.fromMessages([
  ["human", "Human description: {description}"],
]);
const outputParser = new JsonOutputFunctionsParser();

const runnable = createOpenAIFnRunnable({
  functions: [openAIFunction],
  llm: model,
  prompt,
  enforceSingleFunctionUsage: true, // Default is true
  outputParser,
});
const response = await runnable.invoke({
  description:
    "My name's John Doe and I'm 30 years old. My favorite kind of food are chocolate chip cookies.",
});
console.log(response);
/*
{ name: 'John Doe', age: 30, fav_food: 'chocolate chip cookies' }
*/


Tip: You can see a LangSmith trace of this example here.

Multiple functions

When you pass multiple function definitions, the model chooses the most relevant one for the input. Setting enforceSingleFunctionUsage to false leaves that choice to the model.

import { ChatPromptTemplate } from "@langchain/core/prompts";
import { ChatOpenAI } from "@langchain/openai";
import { createOpenAIFnRunnable } from "langchain/chains/openai_functions";
import { JsonOutputFunctionsParser } from "langchain/output_parsers";

const personDetailsFunction = {
  name: "get_person_details",
  description: "Get details about a person",
  parameters: {
    title: "Person",
    description: "Identifying information about a person.",
    type: "object",
    properties: {
      name: { title: "Name", description: "The person's name", type: "string" },
      age: { title: "Age", description: "The person's age", type: "integer" },
      fav_food: {
        title: "Fav Food",
        description: "The person's favorite food",
        type: "string",
      },
    },
    required: ["name", "age"],
  },
};

const weatherFunction = {
  name: "get_weather",
  description: "Get the weather for a location",
  parameters: {
    title: "Location",
    description: "The location to get the weather for.",
    type: "object",
    properties: {
      state: {
        title: "State",
        description: "The location's state",
        type: "string",
      },
      city: {
        title: "City",
        description: "The location's city",
        type: "string",
      },
      zip_code: {
        title: "Zip Code",
        description: "The location's zip code",
        type: "number",
      },
    },
    required: ["state", "city"],
  },
};

const model = new ChatOpenAI();
const prompt = ChatPromptTemplate.fromMessages([
  ["human", "Question: {question}"],
]);
const outputParser = new JsonOutputFunctionsParser();

const runnable = createOpenAIFnRunnable({
  functions: [personDetailsFunction, weatherFunction],
  llm: model,
  prompt,
  enforceSingleFunctionUsage: false, // Default is true
  outputParser,
});
const response = await runnable.invoke({
  question: "What's the weather like in Berkeley CA?",
});
console.log(response);
/*
{ state: 'CA', city: 'Berkeley' }
*/


Tip: You can see a LangSmith trace of this example here.

