Integrations: Chat Models
LangChain offers a number of chat model implementations that integrate with various model providers. These are:
ChatOpenAI
```typescript
import { ChatOpenAI } from "langchain/chat_models/openai";

const model = new ChatOpenAI({
  temperature: 0.9,
  openAIApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
});
```
API Reference:
- ChatOpenAI from langchain/chat_models/openai
Azure ChatOpenAI
```typescript
import { ChatOpenAI } from "langchain/chat_models/openai";

const model = new ChatOpenAI({
  temperature: 0.9,
  azureOpenAIApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.AZURE_OPENAI_API_KEY
  azureOpenAIApiInstanceName: "YOUR-INSTANCE-NAME", // In Node.js defaults to process.env.AZURE_OPENAI_API_INSTANCE_NAME
  azureOpenAIApiDeploymentName: "YOUR-DEPLOYMENT-NAME", // In Node.js defaults to process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME
  azureOpenAIApiVersion: "YOUR-API-VERSION", // In Node.js defaults to process.env.AZURE_OPENAI_API_VERSION
});
```
API Reference:
- ChatOpenAI from langchain/chat_models/openai
ChatAnthropic
```typescript
import { ChatAnthropic } from "langchain/chat_models/anthropic";

const model = new ChatAnthropic({
  temperature: 0.9,
  anthropicApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.ANTHROPIC_API_KEY
});
```
API Reference:
- ChatAnthropic from langchain/chat_models/anthropic
Google Vertex AI
The Vertex AI implementation is meant to be used in Node.js and not directly from a browser, since it requires a service account to use.
Before running this code, you should make sure the Vertex AI API is enabled for the relevant project and that you've authenticated to Google Cloud using one of these methods:
- You are logged into an account (using gcloud auth application-default login) permitted to that project.
- You are running on a machine using a service account that is permitted to the project.
- You have downloaded the credentials for a service account that is permitted to the project and set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of this file.
```bash
# npm
npm install google-auth-library
# Yarn
yarn add google-auth-library
# pnpm
pnpm add google-auth-library
```
The ChatGoogleVertexAI class works just like other chat-based LLMs, with a few exceptions:

- The first SystemChatMessage passed in is mapped to the "context" parameter that the PaLM model expects. No other SystemChatMessages are allowed.
- After the first SystemChatMessage, there must be an odd number of messages, representing a conversation between a human and the model.
- Human messages must alternate with AI messages.
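To make these ordering rules concrete, here is a small standalone sketch (not part of the LangChain API) that checks a sequence of message roles against them, using plain strings in place of the actual message classes:

```typescript
// A hypothetical helper illustrating the ordering rules ChatGoogleVertexAI
// enforces: one optional leading "system" message, then an odd number of
// messages that strictly alternate between "human" and "ai", starting
// with a human turn.
function isValidVertexConversation(roles: string[]): boolean {
  // Drop the single optional leading system message (the "context").
  const rest = roles[0] === "system" ? roles.slice(1) : roles;
  // No further system messages are allowed.
  if (rest.includes("system")) return false;
  // After the context, the conversation must contain an odd number of messages.
  if (rest.length === 0 || rest.length % 2 === 0) return false;
  // Messages must alternate, beginning (and ending) with a human message.
  return rest.every((role, i) => role === (i % 2 === 0 ? "human" : "ai"));
}

console.log(isValidVertexConversation(["system", "human"])); // true
console.log(isValidVertexConversation(["system", "human", "ai", "human"])); // true
console.log(isValidVertexConversation(["human", "human"])); // false: no alternation
console.log(isValidVertexConversation(["human", "ai"])); // false: even count
```

A message list that violates these rules (for example, two consecutive human messages) will be rejected by the model wrapper, so it can be useful to reason through your conversation shape this way before calling the model.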
```typescript
import { ChatGoogleVertexAI } from "langchain/chat_models/googlevertexai";

const model = new ChatGoogleVertexAI({
  temperature: 0.7,
});
```
API Reference:
- ChatGoogleVertexAI from langchain/chat_models/googlevertexai
There is also an optional examples constructor parameter that can help the model understand what an appropriate response looks like.
```typescript
import { ChatGoogleVertexAI } from "langchain/chat_models/googlevertexai";
import {
  AIChatMessage,
  HumanChatMessage,
  SystemChatMessage,
} from "langchain/schema";

export const run = async () => {
  const examples = [
    {
      input: new HumanChatMessage("What is your favorite sock color?"),
      output: new AIChatMessage("My favorite sock color be arrrr-ange!"),
    },
  ];
  const model = new ChatGoogleVertexAI({
    temperature: 0.7,
    examples,
  });
  const questions = [
    new SystemChatMessage(
      "You are a funny assistant that answers in pirate language."
    ),
    new HumanChatMessage("What is your favorite food?"),
  ];
  // You can also use the model as part of a chain
  const res = await model.call(questions);
  console.log({ res });
};
```
API Reference:
- ChatGoogleVertexAI from langchain/chat_models/googlevertexai
- AIChatMessage from langchain/schema
- HumanChatMessage from langchain/schema
- SystemChatMessage from langchain/schema