
OpenAI functions

Certain models (like OpenAI's gpt-3.5-turbo and gpt-4) have been fine-tuned to detect when a function should be called and respond with the inputs that should be passed to the function. In an API call, you can describe functions and have the model intelligently choose to output a JSON object containing arguments to call those functions.

The OpenAI Functions Agent is designed to work with these models.
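For context, a "function" is described to the model as a name, a description, and a JSON Schema for its arguments; the model never executes it, it only decides when to call it and with which arguments. A minimal sketch of such a description (the get_current_weather function here is purely illustrative, not part of this demo):

// A hypothetical function description, for illustration only.
// The model decides when to "call" this and emits JSON arguments for it.
const getWeatherFunction = {
  name: "get_current_weather",
  description: "Get the current weather in a given location",
  parameters: {
    type: "object",
    properties: {
      location: {
        type: "string",
        description: "The city and state, e.g. San Francisco, CA",
      },
    },
    required: ["location"],
  },
};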

Setup

Install the OpenAI integration package, retrieve your OpenAI API key, and store it as an environment variable named OPENAI_API_KEY:

npm install @langchain/openai

This demo also uses Tavily, but you can swap in any other built-in tool. You'll need to sign up for a Tavily API key and set it as the TAVILY_API_KEY environment variable.
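Depending on your setup, the code below may also require the main langchain package and @langchain/community (which provides the Tavily tool):

npm install langchain @langchain/community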

Initialize Tools

We will first create a tool:

import { TavilySearchResults } from "@langchain/community/tools/tavily_search";

// Define the tools the agent will have access to.
const tools = [new TavilySearchResults({ maxResults: 1 })];
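If you want to sanity-check the tool on its own before wiring it into an agent, you can invoke it directly. A quick sketch (the exact contents of the result depend on Tavily's response):

// Try the tool by itself; it returns search results as a string.
const searchResults = await tools[0].invoke("latest LangChain release");
console.log(searchResults);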

Create Agent

import { AgentExecutor, createOpenAIFunctionsAgent } from "langchain/agents";
import { pull } from "langchain/hub";
import { ChatOpenAI } from "@langchain/openai";
import type { ChatPromptTemplate } from "@langchain/core/prompts";

// Get the prompt to use - you can modify this!
// If you want to see the prompt in full, you can view it at:
// https://smith.langchain.com/hub/hwchase17/openai-functions-agent
const prompt = await pull<ChatPromptTemplate>(
  "hwchase17/openai-functions-agent"
);

const llm = new ChatOpenAI({
  model: "gpt-3.5-turbo-1106",
  temperature: 0,
});

const agent = await createOpenAIFunctionsAgent({
  llm,
  tools,
  prompt,
});
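Pulling the prompt from the hub is just a convenience. If you'd rather keep the prompt in your codebase, a roughly equivalent prompt can be defined locally and passed to createOpenAIFunctionsAgent in the same way. This is a sketch assuming the placeholders the functions agent expects (chat_history, input, and agent_scratchpad):

import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";

// Roughly equivalent to the hub prompt: a system message, prior chat
// history, the user's input, and a scratchpad for the agent's function
// calls and tool observations.
const localPrompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  // Pass an empty array as chat_history when there is no prior history.
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
  new MessagesPlaceholder("agent_scratchpad"),
]);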

Run Agent

Now, let's run our agent!

const agentExecutor = new AgentExecutor({
  agent,
  tools,
});

const result = await agentExecutor.invoke({
  input: "what is LangChain?",
});

console.log(result);

/*
  {
    input: 'what is LangChain?',
    output: 'LangChain is an open source project that was launched in October 2022 by Harrison Chase, while working at machine learning startup Robust Intelligence. It is a deployment tool designed to facilitate the transition from LCEL (LangChain Expression Language) prototypes to production-ready applications. LangChain has integrations with systems including Amazon, Google, and Microsoft Azure cloud storage, API wrappers for news, movie information, and weather, Bash for summarization, syntax and semantics checking, and execution of shell scripts, multiple web scraping subsystems and templates, few-shot learning prompt generation support, and more.\n' +
      '\n' +
      "In April 2023, LangChain incorporated as a new startup and raised over $20 million in funding at a valuation of at least $200 million from venture firm Sequoia Capital, a week after announcing a $10 million seed investment from Benchmark. The project quickly garnered popularity, with improvements from hundreds of contributors on GitHub, trending discussions on Twitter, lively activity on the project's Discord server, many YouTube tutorials, and meetups in San Francisco and London.\n" +
      '\n' +
      'For more detailed information, you can visit the [LangChain Wikipedia page](https://en.wikipedia.org/wiki/LangChain).'
  }
*/
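If you want to see the tool calls behind an answer like this, one option is to construct the executor with returnIntermediateSteps enabled (a sketch; you can also pass verbose: true to log each step as it runs):

// Alternative executor construction that also surfaces each
// (agent action, tool observation) pair in the result.
const executorWithSteps = new AgentExecutor({
  agent,
  tools,
  returnIntermediateSteps: true,
});

const detailedResult = await executorWithSteps.invoke({
  input: "what is LangChain?",
});

console.log(detailedResult.intermediateSteps);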

Using with chat history

For more details, see this section of the agent quickstart.

import { AIMessage, HumanMessage } from "@langchain/core/messages";

const result2 = await agentExecutor.invoke({
  input: "what's my name?",
  chat_history: [
    new HumanMessage("hi! my name is cob"),
    new AIMessage("Hello Cob! How can I assist you today?"),
  ],
});

console.log(result2);

/*
  {
    input: "what's my name?",
    chat_history: [
      HumanMessage {
        content: 'hi! my name is cob',
        name: undefined,
        additional_kwargs: {}
      },
      AIMessage {
        content: 'Hello Cob! How can I assist you today?',
        name: undefined,
        additional_kwargs: {}
      }
    ],
    output: 'Your name is Cob. How can I assist you today, Cob?'
  }
*/
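Note that the executor does not store this history for you; each call sees only the chat_history you pass in. One simple pattern is to keep an array of messages yourself and append to it after every turn. A minimal sketch:

import { BaseMessage } from "@langchain/core/messages";

// Keep the running conversation in memory and grow it after each turn.
const chatHistory: BaseMessage[] = [];

const userInput = "hi! my name is cob";
const response = await agentExecutor.invoke({
  input: userInput,
  chat_history: chatHistory,
});

chatHistory.push(new HumanMessage(userInput));
chatHistory.push(new AIMessage(response.output));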
